Container: container_1529349239295_0005_01_000132 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:51:13,266 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/
usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11
.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1
.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/h
adoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-
1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/l
ib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/l
ib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-
1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:51:14,469 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000132/tmp as the basepath for spooling.
2018-06-18 19:51:14,473 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41935
2018-06-18 19:51:15,554 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:51:15,585 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41935/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:15,644 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:51:15,705 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000132/tmp/chkp6902384025681675615 as the basepath for checkpointing.
2018-06-18 19:51:15,718 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:51:15,928 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1bbd5a6b for node 2
2018-06-18 19:51:15,958 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:51:15,958 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:51:17,678 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:51:17,681 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:17,685 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e1cdb40identifier=tcp://laptop-name:41935/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2b04fcf5{da=com.datatorrent.bufferserver.internal.DataList$Block@7c0d2d90{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@a0f8b5c[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000099 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:46:09,622 INFO com.datatorrent.stram.engine.StreamingContainer:
Child starting with classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7
.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/
lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop
-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop
-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/l
ib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/
lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr
/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:46:10,810 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000099/tmp as the basepath for spooling. 
2018-06-18 19:46:10,814 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38991
2018-06-18 19:46:11,915 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:46:12,010 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:46:12,074 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38991/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:46:12,082 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000099/tmp/chkp175981037615792854 as the basepath for checkpointing.
2018-06-18 19:46:12,097 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:46:12,218 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@416888cb for node 2
2018-06-18 19:46:12,340 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:46:12,341 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:46:14,048 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:46:14,052 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:46:14,055 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@337a5942identifier=tcp://laptop-name:38991/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d0d237f{da=com.datatorrent.bufferserver.internal.DataList$Block@30c5cba7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=688, starting_window=5b28089000000001, ending_window=5b28089000000009, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4e5cc0d8[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000066 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22064
Log Contents:
2018-06-18 19:41:08,881 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:10,172 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000066/tmp as the basepath for spooling. 
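[Editorial note: the same aptest.FailureGenerator$ExpectedException appears in each container's stack trace, thrown from failOrNot() inside the Aggregator input port's process() callback. The operator source is not part of this log, so the sketch below is only a hypothetical reconstruction of such a failure-injection helper; the class and exception names come from the trace, while the counter-based trigger is an assumption.]

```java
// Hypothetical reconstruction of the failure-injection helper seen in the
// stack traces; only the names FailureGenerator / ExpectedException / failOrNot
// come from the log. The "fail after N calls" trigger is an assumption.
public class FailureGenerator {

  /** Unchecked, so it propagates out of the operator's process() callback. */
  public static class ExpectedException extends RuntimeException {
    public ExpectedException(String msg) {
      super(msg);
    }
  }

  private final int failAfter; // number of calls tolerated before failing
  private int calls = 0;

  public FailureGenerator(int failAfter) {
    this.failAfter = failAfter;
  }

  /** Throws ExpectedException once more than failAfter calls have been made. */
  public void failOrNot() {
    if (++calls > failAfter) {
      throw new ExpectedException("planned failure after " + failAfter + " tuples");
    }
  }
}
```

In a test topology this would be invoked from the input port's process() method, which matches the trace (failOrNot beneath Aggregator$1.process and DefaultInputPort.put); each time the operator dies, YARN launches a replacement container, which is why near-identical apex.log sections repeat through this dump.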
2018-06-18 19:41:10,175 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35367
2018-06-18 19:41:11,248 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:41:11,345 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:41:11,423 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000066/tmp/chkp4557854619611284155 as the basepath for checkpointing.
2018-06-18 19:41:11,440 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:41:11,552 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@62170dd0 for node 2
2018-06-18 19:41:11,552 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.io.InterruptedIOException: Interrupted while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/127.0.0.1:42138 remote=/127.0.0.1:50010]. 65000 millis timeout left.
	at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:342)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2280)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1343)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:41:11,553 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745095_4271
2018-06-18 19:41:11,556 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:41:11,638 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35367/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:13,375 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:41:13,376 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:41:13,378 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@13941cefidentifier=tcp://laptop-name:35367/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@550fb0db{da=com.datatorrent.bufferserver.internal.DataList$Block@682c42f6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@17abb79a[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000041 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:37:20,985 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with
classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hado
op/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mocki
to-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.
4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-
i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/
jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.ja
r:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib
/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:37:22,142 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000041/tmp as the basepath for spooling.
2018-06-18 19:37:22,146 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38133
2018-06-18 19:37:23,224 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:37:23,299 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:37:23,371 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000041/tmp/chkp5228734534520626221 as the basepath for checkpointing.
2018-06-18 19:37:23,390 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:37:23,430 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38133/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:23,499 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2cd8efaa for node 2
2018-06-18 19:37:23,503 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:23,504 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744617_3793
2018-06-18 19:37:23,506 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:23,507 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:37:25,339 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:37:25,342 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
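The ExpectedException in the trace above originates in application code, not in Apex or Hadoop: the top frames are aptest.FailureGenerator.failOrNot invoked from the Aggregator operator's input-port process method. A minimal, self-contained sketch of what such a deliberate failure-injection helper might look like — only the class name, method name, and nested exception type appear in the log; the every-Nth-call counter is an assumption:

```java
// Hypothetical reconstruction of the failure-injection helper seen in the
// stack trace (aptest.FailureGenerator). The counting logic is assumed;
// the log only shows that failOrNot() throws ExpectedException.
public class FailureGenerator {

    /** Unchecked exception thrown on purpose to simulate an operator crash. */
    public static class ExpectedException extends RuntimeException {
        public ExpectedException(String message) {
            super(message);
        }
    }

    private final int failEvery; // assumed knob: throw on every Nth call
    private int calls;

    public FailureGenerator(int failEvery) {
        this.failEvery = failEvery;
    }

    /** Throws ExpectedException periodically; otherwise returns normally. */
    public void failOrNot() {
        calls++;
        if (calls % failEvery == 0) {
            throw new ExpectedException("deliberate failure on call " + calls);
        }
    }
}
```

When the exception escapes the operator's process callback, the container logs the "stopped running due to an exception" entry and interrupts the operator thread, which is what surfaces the ClosedByInterruptException and InterruptedException entries from in-flight HDFS writes below.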
2018-06-18 19:37:25,347 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1f9dd075identifier=tcp://laptop-name:38133/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1eae6a1{da=com.datatorrent.bufferserver.internal.DataList$Block@38034004{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3518, starting_window=5b28089000000001, ending_window=5b28089000000018, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7c1d2445[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000008 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:32:19,751 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:32:20,918 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000008/tmp as the basepath for spooling.
2018-06-18 19:32:20,922 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39725
2018-06-18 19:32:21,995 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:32:22,085 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:32:22,160 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000008/tmp/chkp1484180315080426514 as the basepath for checkpointing.
2018-06-18 19:32:22,165 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:32:22,285 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5612747d for node 2
2018-06-18 19:32:22,287 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:32:22,287 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073743986_3162
2018-06-18 19:32:22,289 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:32:22,290 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:32:22,389 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39725/2.out.1, windowId=5b28089000000011, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:32:24,105 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:32:24,106 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:24,107 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@191e1b11{identifier=tcp://laptop-name:39725/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d16c736{da=com.datatorrent.bufferserver.internal.DataList$Block@108a4b31{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@318909e5[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000124 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:50:00,461 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:50:01,822 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000124/tmp as the basepath for spooling.
2018-06-18 19:50:01,831 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43191 2018-06-18 19:50:02,937 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:50:03,054 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:50:03,141 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000124/tmp/chkp7419753766229647676 as the basepath for checkpointing. 2018-06-18 19:50:03,171 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:50:03,226 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43191/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:50:03,279 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@18b3a39 for node 2
2018-06-18 19:50:03,285 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:50:03,286 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746210_5386
2018-06-18 19:50:03,288 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:50:03,289 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:50:05,106 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:50:05,107 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:05,110 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@add9926identifier=tcp://laptop-name:43191/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@138bd3f5{da=com.datatorrent.bufferserver.internal.DataList$Block@759f212{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=5534, starting_window=5b28089000000001, ending_window=5b2808900000001f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6ac6eadb[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000091 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:44:56,756 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:44:57,985 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000091/tmp as the basepath for spooling. 
2018-06-18 19:44:57,991 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44421
2018-06-18 19:44:59,062 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:44:59,148 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:44:59,225 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:44:59,259 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44421/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:45:01,178 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:45:01,181 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:45:01,188 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@37595397identifier=tcp://laptop-name:44421/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d6aab23{da=com.datatorrent.bufferserver.internal.DataList$Block@7b631e54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6f72586a[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000058 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19005
Log Contents:
2018-06-18 19:39:55,904 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:39:57,045 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000058/tmp as the basepath for spooling. 
2018-06-18 19:39:57,048 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33033
2018-06-18 19:39:58,151 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:58,233 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33033/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:58,235 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:58,314 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:00,255 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:00,258 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:00,264 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@391eafc2identifier=tcp://laptop-name:33033/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@c3925ba{da=com.datatorrent.bufferserver.internal.DataList$Block@55d68f11{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7747633f[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000033 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22167
Log Contents:
2018-06-18 19:36:08,007 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:36:09,169 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000033/tmp as the basepath for spooling. 
2018-06-18 19:36:09,173 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37053
2018-06-18 19:36:10,246 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:10,335 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:10,407 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000033/tmp/chkp5976232893840099147 as the basepath for checkpointing.
2018-06-18 19:36:10,417 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:10,498 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37053/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:10,535 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@635463d3 for node 2
2018-06-18 19:36:10,551 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:10,552 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744463_3639
2018-06-18 19:36:10,555 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:12,363 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:12,366 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:12,370 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3f7ce648identifier=tcp://laptop-name:37053/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7be6f60c{da=com.datatorrent.bufferserver.internal.DataList$Block@3dcd9786{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1090, starting_window=5b28089000000001, ending_window=5b2808900000000c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2745413c[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000116 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:48:46,491 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:48:47,638 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000116/tmp as the basepath for spooling.
2018-06-18 19:48:47,641 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37021
2018-06-18 19:48:48,711 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:48:48,799 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:48:48,827 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37021/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:48,884 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000116/tmp/chkp4831267988258901925 as the basepath for checkpointing.
2018-06-18 19:48:48,892 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:48:49,010 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@7ff74487 for node 2
2018-06-18 19:48:49,018 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:49,019 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746054_5230
2018-06-18 19:48:49,022 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:50,827 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:48:50,837 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@516eca84identifier=tcp://laptop-name:37021/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@380618db{da=com.datatorrent.bufferserver.internal.DataList$Block@640a6105{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198,
starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@56aaded3[identifier=2.out.1]
2018-06-18 19:48:50,838 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000083 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:43:43,892 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./
commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib
/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/u
sr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient
-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-
mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper
-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-
sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.ja
r:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:43:45,126 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000083/tmp as the basepath for spooling.
2018-06-18 19:43:45,130 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39855
2018-06-18 19:43:46,213 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:43:46,305 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:43:46,374 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000083/tmp/chkp1185718175798706581 as the basepath for checkpointing.
2018-06-18 19:43:46,380 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:43:46,404 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39855/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:46,499 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4373019 for node 2
2018-06-18 19:43:46,508 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:43:46,508 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745420_4596
2018-06-18 19:43:46,511 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:43:46,512 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:43:48,334 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:43:48,337 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:43:48,343 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4413e99fidentifier=tcp://laptop-name:39855/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649e17de{da=com.datatorrent.bufferserver.internal.DataList$Block@2accef54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2c195f8a[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000050 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22061
Log Contents:
2018-06-18 19:38:42,967 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:38:44,145 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000050/tmp as the basepath for spooling.
2018-06-18 19:38:44,150 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43827
2018-06-18 19:38:45,242 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:38:45,342 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43827/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:45,375 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:38:45,437 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000050/tmp/chkp778745596695231180 as the basepath for checkpointing.
2018-06-18 19:38:45,445 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:38:45,567 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@62b3c1a4 for node 2
2018-06-18 19:38:45,568 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.io.InterruptedIOException: Interrupted while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/127.0.0.1:41202 remote=/127.0.0.1:50010]. 65000 millis timeout left.
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:342)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
    at java.io.FilterInputStream.read(FilterInputStream.java:83)
    at java.io.FilterInputStream.read(FilterInputStream.java:83)
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2280)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1343)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:38:45,569 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744790_3966
2018-06-18 19:38:45,572 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:47,406 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:47,408 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:38:47,412 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3c0b6d9eidentifier=tcp://laptop-name:43827/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@26568660{da=com.datatorrent.bufferserver.internal.DataList$Block@4f91f16f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@73d9221[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000025 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23945 Log Contents: 2018-06-18 19:34:54,974 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:34:56,290 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000025/tmp as the basepath for spooling. 
2018-06-18 19:34:56,294 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35309 2018-06-18 19:34:57,362 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:34:57,462 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:34:57,548 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000025/tmp/chkp1130540293761127430 as the basepath for checkpointing. 2018-06-18 19:34:57,557 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:34:57,583 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35309/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:34:57,685 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@6997bac6 for node 2 2018-06-18 19:34:57,699 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:57,702 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744312_3488 2018-06-18 19:34:57,704 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:57,706 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:34:59,493 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:34:59,501 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:34:59,504 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4d9516e5identifier=tcp://laptop-name:35309/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@62e11582{da=com.datatorrent.bufferserver.internal.DataList$Block@7b011eae{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6b3cdf3f[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000141 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20917 Log Contents: 2018-06-18 19:52:35,353 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:52:36,541 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000141/tmp as the basepath for spooling. 
2018-06-18 19:52:36,544 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38057 2018-06-18 19:52:37,617 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:52:37,708 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:52:37,781 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000141/tmp/chkp7642474101649989684 as the basepath for checkpointing. 2018-06-18 19:52:37,819 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:52:37,986 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1de29260 for node 2 2018-06-18 19:52:37,986 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38057/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:52:38,017 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:52:38,017 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:52:39,757 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:52:39,760 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:52:39,765 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3781c95fidentifier=tcp://laptop-name:38057/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@24904fc7{da=com.datatorrent.bufferserver.internal.DataList$Block@3605f32e{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=14228, starting_window=5b28089000000001, ending_window=5b28089000000034, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@78c482af[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000108 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22164 Log Contents: 2018-06-18 19:47:31,928 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:47:34,090 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000108/tmp as the basepath for spooling. 
2018-06-18 19:47:34,095 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46427
2018-06-18 19:47:35,187 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:47:35,312 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:47:35,408 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000108/tmp/chkp569041484266283310 as the basepath for checkpointing.
2018-06-18 19:47:35,415 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:47:35,464 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46427/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:35,522 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@282cb7c5 for node 2
2018-06-18 19:47:35,583 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:35,586 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745900_5076
2018-06-18 19:47:35,590 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:37,336 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:47:37,339 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:37,343 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@447e338didentifier=tcp://laptop-name:46427/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@ec1eb92{da=com.datatorrent.bufferserver.internal.DataList$Block@15e35971{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@20b48e4f[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000075 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:42:30,943 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:42:32,129 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000075/tmp as the basepath for spooling. 
2018-06-18 19:42:32,133 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41833
2018-06-18 19:42:33,230 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:42:33,357 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:42:33,429 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000075/tmp/chkp3933839359967042554 as the basepath for checkpointing.
2018-06-18 19:42:33,442 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:42:33,561 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1ea63b5b for node 2
2018-06-18 19:42:33,649 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41833/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:33,674 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:42:33,674 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:42:35,384 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:42:35,386 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:42:35,390 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7b83f61eidentifier=tcp://laptop-name:41833/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@295101e9{da=com.datatorrent.bufferserver.internal.DataList$Block@396627b0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1242, starting_window=5b28089000000001, ending_window=5b2808900000000d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@70cc3577[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000042 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:37:30,091 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:37:31,231 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000042/tmp as the basepath for spooling.
2018-06-18 19:37:31,234 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45203
2018-06-18 19:37:32,330 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:37:32,414 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:37:32,477 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45203/2.out.1, windowId=5b28089000000016, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:32,492 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000042/tmp/chkp5720430193469078483 as the basepath for checkpointing.
2018-06-18 19:37:32,511 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:37:32,700 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@422a317f for node 2
2018-06-18 19:37:32,724 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:37:32,724 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:37:34,455 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:37:34,458 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:34,462 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@69bdc29f{identifier=tcp://laptop-name:45203/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3cd626f3{da=com.datatorrent.bufferserver.internal.DataList$Block@66bb5f4b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2789, starting_window=5b28089000000001, ending_window=5b28089000000015, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@16a58456[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000017 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:33:42,085 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:33:43,289 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000017/tmp as the basepath for spooling.
2018-06-18 19:33:43,292 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42397
2018-06-18 19:33:44,373 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:33:44,462 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:33:44,532 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000017/tmp/chkp6272639907744616156 as the basepath for checkpointing.
2018-06-18 19:33:44,547 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:33:44,556 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42397/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:44,662 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@238d40b1 for node 2
2018-06-18 19:33:44,672 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:44,673 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744159_3335
2018-06-18 19:33:44,675 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:44,677 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:33:46,496 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:33:46,499 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:46,504 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@60a23b48identifier=tcp://laptop-name:42397/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@59336757{da=com.datatorrent.bufferserver.internal.DataList$Block@7b4e9952{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1943, starting_window=5b28089000000001, ending_window=5b28089000000011, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@f163f31[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000133 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:51:22,367 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:51:23,516 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000133/tmp as the basepath for spooling.
2018-06-18 19:51:23,520 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43987
2018-06-18 19:51:24,611 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:51:24,649 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43987/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:24,735 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:51:24,793 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000133/tmp/chkp2207852994103017043 as the basepath for checkpointing.
2018-06-18 19:51:24,802 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:51:24,923 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@62c2bd36 for node 2
2018-06-18 19:51:24,929 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:51:24,930 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746382_5558
2018-06-18 19:51:24,931 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:51:24,932 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:51:26,765 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:51:26,767 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:26,771 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@513aa349identifier=tcp://laptop-name:43987/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3e9dd91b{da=com.datatorrent.bufferserver.internal.DataList$Block@6d47d0b7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3f5ab8db[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000100 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23947
Log Contents:
2018-06-18 19:46:18,929 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:46:20,128 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000100/tmp as the basepath for spooling.
2018-06-18 19:46:20,131 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37933
2018-06-18 19:46:21,222 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:46:21,310 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:46:21,397 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000100/tmp/chkp4435630068779988476 as the basepath for checkpointing.
2018-06-18 19:46:21,408 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:46:21,523 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@722fb45c for node 2
2018-06-18 19:46:21,531 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:21,531 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745744_4920
2018-06-18 19:46:21,533 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:21,534 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:46:21,602 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37933/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:46:23,348 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:46:23,350 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:46:23,357 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7e7284a3{identifier=tcp://laptop-name:37933/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@67c528cc{da=com.datatorrent.bufferserver.internal.DataList$Block@67a8b82f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1242, starting_window=5b28089000000001, ending_window=5b2808900000000d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4b0a26ed[identifier=2.out.1]

End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000067 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19005
Log Contents:
2018-06-18 19:41:18,039 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:19,212 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000067/tmp as the basepath for spooling. 
2018-06-18 19:41:19,215 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36043
2018-06-18 19:41:20,289 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:41:20,390 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:41:20,456 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:41:20,689 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36043/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:22,411 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:41:22,414 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:41:22,422 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6757e52fidentifier=tcp://laptop-name:36043/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@580027d0{da=com.datatorrent.bufferserver.internal.DataList$Block@7de5f17f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@af4db39[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000034 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22163
Log Contents:
2018-06-18 19:36:17,108 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:36:18,278 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000034/tmp as the basepath for spooling. 
2018-06-18 19:36:18,282 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36259
2018-06-18 19:36:19,350 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:19,423 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:19,499 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000034/tmp/chkp776772293912355618 as the basepath for checkpointing.
2018-06-18 19:36:19,502 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:19,551 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36259/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:19,623 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@49ab1292 for node 2
2018-06-18 19:36:19,633 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:19,634 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744482_3658
2018-06-18 19:36:19,637 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:21,451 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:21,453 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:21,458 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@a9e869cidentifier=tcp://laptop-name:36259/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1451dfc6{da=com.datatorrent.bufferserver.internal.DataList$Block@2f6e02b3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@45fbf1ff[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000009 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:32:28,883 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:32:30,182 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000009/tmp as the basepath for spooling.
2018-06-18 19:32:30,186 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38157
2018-06-18 19:32:31,333 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:32:31,484 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38157/2.out.1, windowId=5b28089000000011, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:32:31,492 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:32:31,600 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000009/tmp/chkp6880806443918471532 as the basepath for checkpointing.
2018-06-18 19:32:31,607 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:32:31,728 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@d04a0e5 for node 2
2018-06-18 19:32:31,748 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:32:31,749 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744006_3182
2018-06-18 19:32:31,750 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:32:31,751 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:32:33,530 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:32:33,538 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:33,540 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@205d5faa{identifier=tcp://laptop-name:38157/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@66bb2237{da=com.datatorrent.bufferserver.internal.DataList$Block@16e91c13{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@d6acd86[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000125 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22164
Log Contents:
2018-06-18 19:50:09,515 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:50:10,778 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000125/tmp as the basepath for spooling. 
2018-06-18 19:50:10,782 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33223 2018-06-18 19:50:11,889 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:50:11,999 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:50:12,072 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000125/tmp/chkp3315190681254930621 as the basepath for checkpointing. 2018-06-18 19:50:12,077 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:50:12,203 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@6e082a53 for node 2 2018-06-18 19:50:12,218 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:50:12,219 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746229_5405 2018-06-18 19:50:12,222 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:50:12,261 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33223/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:14,015 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:50:14,017 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:50:14,021 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1e56c9fd{identifier=tcp://laptop-name:33223/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@b782657{da=com.datatorrent.bufferserver.internal.DataList$Block@7adeec25{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198, starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7afb7ba[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000092 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:45:05,933 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:07,110 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000092/tmp as the basepath for spooling. 
2018-06-18 19:45:07,114 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36599
2018-06-18 19:45:08,188 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:45:08,276 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36599/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:45:08,296 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:45:08,383 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000092/tmp/chkp5867289557240395817 as the basepath for checkpointing.
2018-06-18 19:45:08,389 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:45:08,510 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@7efc047c for node 2
2018-06-18 19:45:08,523 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:45:08,523 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745591_4767
2018-06-18 19:45:08,526 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:45:08,527 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:45:10,327 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:45:10,330 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:45:10,336 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5bcb8a62identifier=tcp://laptop-name:36599/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7df80992{da=com.datatorrent.bufferserver.internal.DataList$Block@14640e46{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@755bf8a3[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000059 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19005
Log Contents:
2018-06-18 19:40:05,039 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:40:06,260 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000059/tmp as the basepath for spooling. 
2018-06-18 19:40:06,263 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38771
2018-06-18 19:40:07,371 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:07,493 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:07,560 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:07,798 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38771/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:40:09,516 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:09,518 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:09,523 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@d806ee2identifier=tcp://laptop-name:38771/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d9d037d{da=com.datatorrent.bufferserver.internal.DataList$Block@488b3c6f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6c20a7fb[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000026 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19004
Log Contents:
2018-06-18 19:35:04,097 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:35:05,259 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000026/tmp as the basepath for spooling.
2018-06-18 19:35:05,263 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41063
2018-06-18 19:35:06,346 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:35:06,440 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:35:06,508 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:35:06,620 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41063/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:08,460 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:35:08,463 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
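The stack trace above shows a deliberately injected failure: `aptest.FailureGenerator.failOrNot` throws `ExpectedException` from inside the `Aggregator` operator's input-port `process` callback, which is a common way to exercise Apex's operator-recovery path. The `aptest` sources are not part of this log; the following is a rough, self-contained sketch (class and method names mirror the trace, the counter-based trigger is an assumption) of what such a fault injector might look like:

```java
// Hypothetical sketch of the fault-injection pattern seen in the stack trace.
// Not the actual aptest sources; the fail-after-N-tuples trigger is assumed.
public class FailureSketch {

    // Stand-in for aptest.FailureGenerator$ExpectedException
    static class ExpectedException extends RuntimeException {
    }

    static int processed = 0;

    // Mimics failOrNot(...): throws once a configured number of tuples
    // has passed through the operator's process() callback.
    static void failOrNot(int failAfter) {
        if (++processed > failAfter) {
            throw new ExpectedException();
        }
    }

    public static void main(String[] args) {
        int failAfter = 3;
        try {
            // Stands in for tuples arriving on the Aggregator's input port.
            for (int tuple = 0; tuple < 10; tuple++) {
                failOrNot(failAfter);
            }
        } catch (ExpectedException e) {
            // The container logs the exception (the ERROR entry above)
            // and the operator is subsequently undeployed.
            System.out.println("operator stopped after " + processed + " tuples");
        }
    }
}
```

In the real application the throwing code sits inside a `DefaultInputPort.process` override, which is why the trace passes through `com.datatorrent.api.DefaultInputPort.put` before reaching the operator code.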
2018-06-18 19:35:08,467 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@53dc9886identifier=tcp://laptop-name:41063/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6484767a{da=com.datatorrent.bufferserver.internal.DataList$Block@7485a6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@34966eab[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000001 on localhost_40317
======================================================================
LogType:AppMaster.stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:1866
Log Contents:
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.datatorrent.stram.webapp.StramWebApp$JAXBContextResolver as a provider class
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.datatorrent.stram.webapp.WebServices as a root resource class
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.datatorrent.stram.webapp.StramWebServices as a root resource class
Jun 18, 2018 7:31:30 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.datatorrent.stram.webapp.StramWebApp$JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.datatorrent.stram.webapp.WebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Jun 18, 2018 7:31:30 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.datatorrent.stram.webapp.StramWebServices to GuiceManagedComponentProvider with the scope "Singleton"
End of LogType:AppMaster.stderr

LogType:AppMaster.stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:AppMaster.stdout

LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:874563
Log Contents:
2018-06-18 19:31:27,125 INFO com.datatorrent.stram.StreamingAppMaster: Master starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar
2018-06-18 19:31:27,129 INFO com.datatorrent.stram.StreamingAppMaster: version: 3.7.0 from rev: cd0b0d9 branch: cd0b0d9f31b3a198425440b66c52802d1e592b4e by Pramod Immaneni on 14.04.2018 @ 08:03:49 PDT
2018-06-18 19:31:27,130 INFO com.datatorrent.stram.StreamingAppMaster: appmaster env: 
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games HADOOP_CONF_DIR=/etc/hadoop/conf MAX_APP_ATTEMPTS=2 SUDO_USER=apex MAIL=/var/mail/yarn LD_LIBRARY_PATH=:/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/native HADOOP_HOME_WARN_SUPPRESS=true USERNAME=root LOGNAME=apex JVM_PID=21531 JSVC_HOME=/usr/lib/bigtop-utils PWD=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000001 HADOOP_YARN_USER=yarn HADOOP_PREFIX=/usr/lib/hadoop LOCAL_DIRS=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005 YARN_IDENT_STRING=yarn SHELL=/bin/bash YARN_CONF_DIR=/etc/hadoop/conf LOG_DIRS=/var/log/hadoop-yarn/containers/application_1529349239295_0005/container_1529349239295_0005_01_000001 NM_AUX_SERVICE_mapreduce_shuffle=AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA= HADOOP_YARN_HOME=/usr/lib/hadoop-yarn YARN_PID_DIR=/var/run/hadoop-yarn HADOOP_HOME=/usr/lib/hadoop SHLVL=5 YARN_ROOT_LOGGER=INFO,RFA JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 TERM=xterm APP_SUBMIT_TIME_ENV=1529350284776 NM_HOST=localhost YARN_LOGFILE=yarn-yarn-nodemanager-laptop-name.log HADOOP_USER_NAME=apex SUDO_GID=1000 HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec YARN_LOG_DIR=/var/log/hadoop-yarn HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce SUDO_UID=1000 HADOOP_COMMON_HOME=/usr/lib/hadoop _=/usr/lib/jvm/java-8-openjdk-amd64/bin/java APPLICATION_WEB_PROXY_BASE=/proxy/application_1529349239295_0005 NM_HTTP_PORT=8042 NM_PORT=40317 USER=apex CLASSPATH=./*:/etc/hadoop/conf:/usr/lib/hadoop/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-mapreduce/*:/usr/lib/hadoop-mapreduce/lib/*:/usr/lib/hadoop-yarn/*:/usr/lib/hadoop-yarn/lib/* SUDO_COMMAND=/etc/init.d/hadoop-yarn-nodemanager restart 
HADOOP_TOKEN_FILE_LOCATION=/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000001/container_tokens HOSTNAME=laptop-name YARN_NICENESS=0 HOME=/home/ CONTAINER_ID=container_1529349239295_0005_01_000001 MALLOC_ARENA_MAX=4
2018-06-18 19:31:27,159 INFO com.datatorrent.stram.StreamingAppMaster: Initializing Application Master.
2018-06-18 19:31:27,231 INFO com.datatorrent.stram.StreamingAppMasterService: Application master, appId=5, clustertimestamp=1529349239295, attemptId=1
2018-06-18 19:31:28,385 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000001/tmp/chkp7632375253200875181 as the basepath for checkpointing.
2018-06-18 19:31:28,870 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2018-06-18 19:31:29,105 INFO com.datatorrent.stram.FSRecoveryHandler: Creating hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005/recovery/log
2018-06-18 19:31:29,159 INFO com.datatorrent.stram.StreamingAppMasterService: Starting application with 3 operators in 3 containers
2018-06-18 19:31:29,210 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Upper bound of the thread pool size is 500
2018-06-18 19:31:29,212 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
2018-06-18 19:31:29,301 WARN org.apache.hadoop.conf.Configuration: org.apache.hadoop.hdfs.client.HdfsDataInputStream@1e63d216:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2018-06-18 19:31:29,303 WARN org.apache.hadoop.conf.Configuration: org.apache.hadoop.hdfs.client.HdfsDataInputStream@1e63d216:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2018-06-18 19:31:29,310 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8030
2018-06-18 19:31:29,334 INFO com.datatorrent.stram.StreamingContainerParent: Config: Configuration: core-default.xml, core-site.xml, yarn-default.xml, yarn-site.xml, mapred-default.xml, mapred-site.xml, hdfs-default.xml, hdfs-site.xml
2018-06-18 19:31:29,334 INFO com.datatorrent.stram.StreamingContainerParent: Listener thread count 30
2018-06-18 19:31:29,340 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2018-06-18 19:31:29,347 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 42381
2018-06-18 19:31:29,359 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2018-06-18 19:31:29,360 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 42381: starting
2018-06-18 19:31:29,419 INFO com.datatorrent.stram.StreamingContainerParent: Container callback server listening at laptop-name/127.0.0.1:42381
2018-06-18 19:31:29,449 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2018-06-18 19:31:29,554 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2018-06-18 19:31:29,564 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.stram is not defined
2018-06-18 19:31:29,574 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2018-06-18 19:31:29,577 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context stram
2018-06-18 19:31:29,577 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2018-06-18 19:31:29,577 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2018-06-18 19:31:29,586 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /stram/*
2018-06-18 19:31:29,586 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
2018-06-18 19:31:29,932 INFO org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
2018-06-18 19:31:29,933 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 39819
2018-06-18 19:31:35,855 WARN com.datatorrent.stram.webapp.TypeGraphFactory: The size of precomputed type graph is 5709 KB
2018-06-18 19:31:36,183 INFO org.apache.hadoop.yarn.webapp.WebApps: Web app stram started at 39819
2018-06-18 19:31:36,183 INFO com.datatorrent.stram.StreamingAppMasterService: Started web service at port: 39819
2018-06-18 19:31:36,184 INFO com.datatorrent.stram.StreamingAppMasterService: Setting tracking URL to: localhost:39819
2018-06-18 19:31:36,193 INFO com.datatorrent.stram.StreamingAppMasterService: Starting ApplicationMaster
2018-06-18 19:31:36,193 INFO com.datatorrent.stram.StreamingAppMasterService: number of tokens: 1
2018-06-18 19:31:36,246 INFO com.datatorrent.stram.StreamingAppMasterService: Max mem 8192m, Min mem 1024m, Max vcores 32 and Min vcores 1 capabililty of resources in this cluster
2018-06-18 19:31:36,246 INFO com.datatorrent.stram.StreamingAppMasterService: Blacklist removal time in millis = 3600000, max consecutive node failure count = 2147483647
2018-06-18 19:31:36,247 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2018-06-18 19:31:37,300 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:31:37,300 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:31:37,308 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=1,name=SequenceGenerator,state=PENDING_DEPLOY]
2018-06-18 19:31:37,309 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:31:37,309 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=3,name=FileOutput,state=PENDING_DEPLOY]
2018-06-18 19:31:37,309 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:31:38,381 INFO org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl: Received new token for : localhost:40317
2018-06-18 19:31:38,387 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000002, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority0
2018-06-18 19:31:38,412 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000002
2018-06-18 19:31:38,428 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:31:38,998 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:31:38,998 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000002
2018-06-18 19:31:38,998 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000002 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:31:39,010 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000002
2018-06-18 19:31:39,014 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:40,012 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000003, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority1
2018-06-18 19:31:40,013 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000003
2018-06-18 19:31:40,013 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:31:40,063 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:31:40,063 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000003
2018-06-18 19:31:40,067 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000003 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:31:40,072 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000003
2018-06-18 19:31:40,072 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:41,077 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000004, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory1024, priority2
2018-06-18 19:31:41,080 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000004
2018-06-18 19:31:41,080 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:31:41,162 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:31:41,163 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx805306368 for container container_1529349239295_0005_01_000004
2018-06-18 19:31:41,163 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx805306368 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000004 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:31:41,172 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000004
2018-06-18 19:31:41,172 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:41,971 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000002] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000002),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:31:42,989 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000002 buffer server: laptop-name:39089
2018-06-18 19:31:43,551 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000003] Entering heartbeat loop.. context: PTContainer[id=1(container_1529349239295_0005_01_000003),state=ALLOCATED,operators=[PTOperator[id=1,name=SequenceGenerator,state=PENDING_DEPLOY]]]
2018-06-18 19:31:44,343 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000004] Entering heartbeat loop.. context: PTContainer[id=3(container_1529349239295_0005_01_000004),state=ALLOCATED,operators=[PTOperator[id=3,name=FileOutput,state=PENDING_DEPLOY]]]
2018-06-18 19:31:44,575 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000003 buffer server: laptop-name:41209
2018-06-18 19:31:45,365 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000004 buffer server: laptop-name:43545
2018-06-18 19:31:45,982 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000002),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:31:47,675 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 1
2018-06-18 19:31:47,676 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:31:48,179 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000002
2018-06-18 19:31:48,181 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000002
2018-06-18 19:31:48,183 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:49,193 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000002, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:31:49,194 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000002@localhost:40317
2018-06-18 19:31:49,195 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:31:50,255 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:31:50,256 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:31:51,267 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000005, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority3
2018-06-18 19:31:51,267 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000002
2018-06-18 19:31:51,271 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000005
2018-06-18 19:31:51,271 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:31:51,361 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:31:51,361 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000005
2018-06-18 19:31:51,361 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000005 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:31:51,364 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000005
2018-06-18 19:31:51,364 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:53,462 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000005] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000005),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:31:54,483 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000005 buffer server: laptop-name:45867
2018-06-18 19:31:54,759 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000005),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:31:56,704 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 2
2018-06-18 19:31:56,704 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:31:57,373 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000005
2018-06-18 19:31:57,375 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000005
2018-06-18 19:31:57,377 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:31:58,380 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000005, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:31:58,381 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000005@localhost:40317
2018-06-18 19:31:58,381 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:31:59,420 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:31:59,420 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:00,430 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000006, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority4
2018-06-18 19:32:00,431 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000005
2018-06-18 19:32:00,433 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000006
2018-06-18 19:32:00,433 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:00,520 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:00,520 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000006
2018-06-18 19:32:00,521 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000006 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:00,522 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000006
2018-06-18 19:32:00,522 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:02,617 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000006] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000006),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:03,637 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000006 buffer server: laptop-name:40735
2018-06-18 19:32:03,893 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000006),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:05,843 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 3
2018-06-18 19:32:05,844 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:06,533 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000006
2018-06-18 19:32:06,536 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000006
2018-06-18 19:32:06,538 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:07,543 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000006, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:07,543 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000006@localhost:40317
2018-06-18 19:32:07,544 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:08,606 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:08,606 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:09,617 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000007, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority5
2018-06-18 19:32:09,617 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000006
2018-06-18 19:32:09,620 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000007
2018-06-18 19:32:09,620 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:09,715 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:09,716 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000007
2018-06-18 19:32:09,716 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000007 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:09,718 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000007
2018-06-18 19:32:09,718 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:11,882 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000007] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000007),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:12,902 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000007 buffer server: laptop-name:46723
2018-06-18 19:32:13,140 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000007),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:15,073 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 4
2018-06-18 19:32:15,074 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:15,731 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000007
2018-06-18 19:32:15,733 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000007
2018-06-18 19:32:15,735 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:16,738 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000007, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:16,739 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000007@localhost:40317
2018-06-18 19:32:16,739 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:17,791 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:17,792 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:18,805 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000008, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority6
2018-06-18 19:32:18,806 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000007
2018-06-18 19:32:18,807 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000008
2018-06-18 19:32:18,808 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:18,896 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:18,897 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000008
2018-06-18 19:32:18,897 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000008 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:18,898 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000008
2018-06-18 19:32:18,898 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:20,941 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000008] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000008),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:21,961 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000008 buffer server: laptop-name:39725
2018-06-18 19:32:22,183 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000008),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:24,103 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 5
2018-06-18 19:32:24,103 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:24,907 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000008
2018-06-18 19:32:24,907 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000008
2018-06-18 19:32:24,909 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:25,913 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000008, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:25,914 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000008@localhost:40317
2018-06-18 19:32:25,914 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:26,958 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:26,958 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:27,968 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000009, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority7
2018-06-18 19:32:27,969 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000008
2018-06-18 19:32:27,970 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000009
2018-06-18 19:32:27,971 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:28,033 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:28,034 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000009
2018-06-18 19:32:28,034 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000009 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:28,034 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000009
2018-06-18 19:32:28,035 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:30,226 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000009] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000009),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:31,251 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000009 buffer server: laptop-name:38157
2018-06-18 19:32:31,627 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000009),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:33,523 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 6
2018-06-18 19:32:33,524 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:34,047 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000009
2018-06-18 19:32:34,048 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000009
2018-06-18 19:32:34,050 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:35,055 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000009, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:35,055 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000009@localhost:40317
2018-06-18 19:32:35,056 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:36,097 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:36,097 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:37,110 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000010, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority8
2018-06-18 19:32:37,110 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000009
2018-06-18 19:32:37,112 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000010
2018-06-18 19:32:37,112 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:37,171 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:37,171 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000010
2018-06-18 19:32:37,171 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000010 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:37,171 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000010
2018-06-18 19:32:37,172 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:39,156 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000010] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000010),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:40,176 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000010 buffer server: laptop-name:34205
2018-06-18 19:32:40,408 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000010),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:42,335 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 7
2018-06-18 19:32:42,335 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:43,182 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000010
2018-06-18 19:32:43,182 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000010
2018-06-18 19:32:43,185 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:44,189 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000010, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:44,190 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000010@localhost:40317
2018-06-18 19:32:44,190 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:45,230 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:45,231 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:46,241 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000011, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority9
2018-06-18 19:32:46,242 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000010
2018-06-18 19:32:46,244 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000011
2018-06-18 19:32:46,244 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:46,326 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:46,326 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000011
2018-06-18 19:32:46,327 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000011 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:46,327 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000011
2018-06-18 19:32:46,328 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:48,663 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000011] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000011),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:49,684 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000011 buffer server: laptop-name:39771
2018-06-18 19:32:49,928 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000011),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:51,858 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 8
2018-06-18 19:32:51,858 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:32:52,338 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000011
2018-06-18 19:32:52,338 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000011
2018-06-18 19:32:52,341 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:53,344 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000011, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:32:53,345 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000011@localhost:40317
2018-06-18 19:32:53,345 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:32:54,392 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:32:54,392 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:32:55,401 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000012, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority10
2018-06-18 19:32:55,402 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000011
2018-06-18 19:32:55,403 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000012
2018-06-18 19:32:55,404 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:32:55,470 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:32:55,470 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000012
2018-06-18 19:32:55,471 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000012 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:32:55,471 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000012
2018-06-18 19:32:55,471 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:32:57,537 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000012] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000012),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:32:58,557 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000012 buffer server: laptop-name:46031
2018-06-18 19:32:58,838 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000012),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:00,763 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 9
2018-06-18 19:33:00,763 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:01,483 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000012
2018-06-18 19:33:01,483 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000012
2018-06-18 19:33:01,488 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:02,496 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000012, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:02,497 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000012@localhost:40317
2018-06-18 19:33:02,497 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:03,541 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:03,542 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:04,553 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000013, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority11
2018-06-18 19:33:04,554 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000012
2018-06-18 19:33:04,556 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000013
2018-06-18 19:33:04,556 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:04,630 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:04,630 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000013
2018-06-18 19:33:04,631 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000013 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:04,631 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000013
2018-06-18 19:33:04,631 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:06,822 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000013] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000013),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:07,842 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000013 buffer server: laptop-name:36913
2018-06-18 19:33:08,107 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000013),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:10,030 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 10
2018-06-18 19:33:10,030 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:10,641 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000013
2018-06-18 19:33:10,642 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000013
2018-06-18 19:33:10,643 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:11,648 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000013, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:11,648 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000013@localhost:40317
2018-06-18 19:33:11,649 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:12,689 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:12,689 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:13,699 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000014, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority12
2018-06-18 19:33:13,700 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000013
2018-06-18 19:33:13,702 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000014
2018-06-18 19:33:13,702 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:13,788 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:13,788 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000014
2018-06-18 19:33:13,789 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000014 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:13,789 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000014
2018-06-18 19:33:13,789 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:15,942 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000014] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000014),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:16,963 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000014 buffer server: laptop-name:36729
2018-06-18 19:33:17,265 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000014),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:19,179 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 11
2018-06-18 19:33:19,179 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:19,800 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000014
2018-06-18 19:33:19,801 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000014
2018-06-18 19:33:19,803 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:20,807 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000014, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:20,807 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000014@localhost:40317
2018-06-18 19:33:20,808 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:21,845 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:21,845 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:22,852 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000015, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority13
2018-06-18 19:33:22,852 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000014
2018-06-18 19:33:22,854 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000015
2018-06-18 19:33:22,854 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:22,921 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:33:22,921 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000015 2018-06-18 19:33:22,922 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000015 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:33:22,922 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000015 2018-06-18 19:33:22,923 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:33:25,073 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000015] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000015),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:33:26,094 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000015 buffer server: laptop-name:34849 2018-06-18 19:33:26,384 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000015),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:33:28,300 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 12 2018-06-18 19:33:28,300 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:33:28,935 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000015 2018-06-18 19:33:28,935 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000015 2018-06-18 19:33:28,937 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:33:29,942 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000015, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. 
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:29,943 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000015@localhost:40317
2018-06-18 19:33:29,943 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:30,991 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:30,991 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:32,000 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000016, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority14
2018-06-18 19:33:32,000 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000015
2018-06-18 19:33:32,002 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000016
2018-06-18 19:33:32,002 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:32,061 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:32,061 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000016
2018-06-18 19:33:32,062 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000016 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:32,062 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000016
2018-06-18 19:33:32,062 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:34,128 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000016] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000016),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:35,149 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000016 buffer server: laptop-name:40703
2018-06-18 19:33:35,414 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000016),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:37,342 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 13
2018-06-18 19:33:37,342 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:38,071 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000016
2018-06-18 19:33:38,071 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000016
2018-06-18 19:33:38,073 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:39,077 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000016, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:39,077 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000016@localhost:40317
2018-06-18 19:33:39,078 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:40,123 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:40,124 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:41,134 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000017, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority15
2018-06-18 19:33:41,134 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000016
2018-06-18 19:33:41,136 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000017
2018-06-18 19:33:41,137 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:41,199 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:41,200 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000017
2018-06-18 19:33:41,200 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000017 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:41,200 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000017
2018-06-18 19:33:41,200 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:43,311 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000017] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000017),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:44,334 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000017 buffer server: laptop-name:42397
2018-06-18 19:33:44,559 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000017),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:46,492 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 14
2018-06-18 19:33:46,492 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:47,212 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000017
2018-06-18 19:33:47,213 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000017
2018-06-18 19:33:47,215 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:48,221 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000017, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:48,221 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000017@localhost:40317
2018-06-18 19:33:48,225 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:49,275 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:49,275 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:50,282 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000018, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority16
2018-06-18 19:33:50,282 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000017
2018-06-18 19:33:50,283 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000018
2018-06-18 19:33:50,284 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:50,310 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:50,310 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000018
2018-06-18 19:33:50,311 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000018 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:50,311 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000018
2018-06-18 19:33:50,311 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:52,340 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000018] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000018),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:53,360 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000018 buffer server: laptop-name:33185
2018-06-18 19:33:53,611 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000018),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:33:55,536 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 15
2018-06-18 19:33:55,536 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:33:56,322 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000018
2018-06-18 19:33:56,322 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000018
2018-06-18 19:33:56,323 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:33:57,327 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000018, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:33:57,327 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000018@localhost:40317
2018-06-18 19:33:57,327 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:33:58,383 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:33:58,383 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:33:59,390 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000019, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority17
2018-06-18 19:33:59,391 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000018
2018-06-18 19:33:59,393 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000019
2018-06-18 19:33:59,393 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:33:59,450 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:33:59,450 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000019
2018-06-18 19:33:59,451 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000019 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:33:59,451 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000019
2018-06-18 19:33:59,452 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:01,524 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000019] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000019),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:02,546 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000019 buffer server: laptop-name:32783
2018-06-18 19:34:02,831 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000019),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:04,773 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 16
2018-06-18 19:34:04,773 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:05,460 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000019
2018-06-18 19:34:05,461 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000019
2018-06-18 19:34:05,463 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:06,468 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000019, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:34:06,469 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000019@localhost:40317
2018-06-18 19:34:06,469 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:34:07,507 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:34:07,507 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:34:08,517 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000020, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority18
2018-06-18 19:34:08,517 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000019
2018-06-18 19:34:08,519 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000020
2018-06-18 19:34:08,519 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:34:08,572 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:34:08,572 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000020
2018-06-18 19:34:08,572 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000020 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:34:08,572 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000020
2018-06-18 19:34:08,572 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:10,658 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000020] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000020),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:11,678 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000020 buffer server: laptop-name:40743
2018-06-18 19:34:11,968 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000020),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:13,908 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 17
2018-06-18 19:34:13,908 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:14,582 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000020
2018-06-18 19:34:14,583 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000020
2018-06-18 19:34:14,584 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:15,589 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000020, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:34:15,589 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000020@localhost:40317
2018-06-18 19:34:15,589 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:34:16,631 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:34:16,631 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:34:17,639 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000021, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority19
2018-06-18 19:34:17,639 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000020
2018-06-18 19:34:17,639 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000021
2018-06-18 19:34:17,639 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:34:17,657 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:34:17,657 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000021
2018-06-18 19:34:17,657 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000021 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:34:17,658 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000021
2018-06-18 19:34:17,658 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:19,751 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000021] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000021),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:20,772 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000021 buffer server: laptop-name:39031
2018-06-18 19:34:21,052 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000021),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:22,945 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 18
2018-06-18 19:34:22,945 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:23,668 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000021
2018-06-18 19:34:23,669 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000021
2018-06-18 19:34:23,671 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:24,676 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000021, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:34:24,676 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000021@localhost:40317 2018-06-18 19:34:24,676 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:34:25,713 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:34:25,713 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:34:26,721 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000022, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority20 2018-06-18 19:34:26,721 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000021 2018-06-18 19:34:26,723 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000022 2018-06-18 19:34:26,723 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 
2018-06-18 19:34:26,775 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:34:26,775 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000022 2018-06-18 19:34:26,775 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000022 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:34:26,775 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000022 2018-06-18 19:34:26,776 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:34:29,021 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000022] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000022),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:34:30,042 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000022 buffer server: laptop-name:41795 2018-06-18 19:34:30,315 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000022),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:32,241 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 19
2018-06-18 19:34:32,241 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:32,785 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000022
2018-06-18 19:34:32,786 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000022
2018-06-18 19:34:32,787 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:33,791 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000022, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:34:33,791 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000022@localhost:40317
2018-06-18 19:34:33,791 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:34:34,823 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:34:34,823 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:34:35,831 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000023, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority21
2018-06-18 19:34:35,831 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000022
2018-06-18 19:34:35,832 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000023
2018-06-18 19:34:35,833 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:34:35,885 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:34:35,885 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000023
2018-06-18 19:34:35,886 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000023 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:34:35,886 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000023
2018-06-18 19:34:35,886 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:37,972 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000023] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000023),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:38,991 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000023 buffer server: laptop-name:40679
2018-06-18 19:34:39,286 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000023),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:41,200 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 20
2018-06-18 19:34:41,200 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:41,897 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000023
2018-06-18 19:34:41,898 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000023
2018-06-18 19:34:41,901 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:42,904 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000023, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:34:42,904 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000023@localhost:40317
2018-06-18 19:34:42,905 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:34:43,949 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:34:43,949 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:34:44,961 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000024, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority22
2018-06-18 19:34:44,961 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000023
2018-06-18 19:34:44,964 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000024
2018-06-18 19:34:44,964 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:34:45,026 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:34:45,026 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000024
2018-06-18 19:34:45,026 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000024 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:34:45,027 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000024
2018-06-18 19:34:45,027 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:47,083 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000024] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000024),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:48,092 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000024 buffer server: laptop-name:36075
2018-06-18 19:34:48,319 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000024),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:50,245 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 21
2018-06-18 19:34:50,245 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:34:51,037 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000024
2018-06-18 19:34:51,037 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000024
2018-06-18 19:34:51,039 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:52,044 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000024, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:34:52,044 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000024@localhost:40317
2018-06-18 19:34:52,044 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:34:53,086 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:34:53,086 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:34:54,093 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000025, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority23
2018-06-18 19:34:54,093 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000024
2018-06-18 19:34:54,093 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000025
2018-06-18 19:34:54,094 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:34:54,114 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:34:54,114 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000025
2018-06-18 19:34:54,114 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000025 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:34:54,114 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000025
2018-06-18 19:34:54,115 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:34:56,312 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000025] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000025),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:57,330 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000025 buffer server: laptop-name:35309
2018-06-18 19:34:57,579 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000025),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:34:59,487 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 22
2018-06-18 19:34:59,488 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:00,125 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000025
2018-06-18 19:35:00,126 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000025
2018-06-18 19:35:00,128 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:01,132 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000025, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:01,132 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000025@localhost:40317
2018-06-18 19:35:01,133 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:02,185 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:02,185 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:03,195 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000026, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority24
2018-06-18 19:35:03,195 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000025
2018-06-18 19:35:03,197 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000026
2018-06-18 19:35:03,197 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:03,254 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:03,254 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000026
2018-06-18 19:35:03,254 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000026 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:35:03,255 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000026
2018-06-18 19:35:03,255 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:05,282 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000026] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000026),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:06,300 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000026 buffer server: laptop-name:41063
2018-06-18 19:35:06,534 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000026),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:08,457 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 23
2018-06-18 19:35:08,457 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:09,271 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000026
2018-06-18 19:35:09,272 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000026
2018-06-18 19:35:09,273 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:10,278 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000026, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:10,278 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000026@localhost:40317
2018-06-18 19:35:10,278 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:11,319 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:11,320 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:12,329 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000027, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority25
2018-06-18 19:35:12,329 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000026
2018-06-18 19:35:12,330 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000027
2018-06-18 19:35:12,331 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:12,387 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:12,387 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000027
2018-06-18 19:35:12,387 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000027 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:35:12,388 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000027
2018-06-18 19:35:12,388 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:14,466 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000027] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000027),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:15,485 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000027 buffer server: laptop-name:43109
2018-06-18 19:35:15,704 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000027),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:17,624 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 24
2018-06-18 19:35:17,624 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:18,399 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000027
2018-06-18 19:35:18,400 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000027
2018-06-18 19:35:18,401 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:19,405 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000027, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:19,406 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000027@localhost:40317
2018-06-18 19:35:19,406 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:20,445 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:20,445 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:21,454 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000028, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority26
2018-06-18 19:35:21,454 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000027
2018-06-18 19:35:21,456 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000028
2018-06-18 19:35:21,456 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:21,516 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:21,516 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000028
2018-06-18 19:35:21,517 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000028 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:35:21,518 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000028
2018-06-18 19:35:21,518 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:23,609 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000028] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000028),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:24,628 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000028 buffer server: laptop-name:41533
2018-06-18 19:35:24,939 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000028),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:35:26,847 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 25 2018-06-18 19:35:26,848 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:35:27,527 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000028 2018-06-18 19:35:27,528 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000028 2018-06-18 19:35:27,529 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:35:28,534 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000028, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. 
Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:35:28,534 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000028@localhost:40317 2018-06-18 19:35:28,534 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:35:29,571 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:35:29,571 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:35:30,581 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000029, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority27 2018-06-18 19:35:30,581 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000028 2018-06-18 19:35:30,582 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000029 2018-06-18 19:35:30,583 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 
2018-06-18 19:35:30,639 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:30,640 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000029
2018-06-18 19:35:30,640 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000029 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:35:30,642 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000029
2018-06-18 19:35:30,642 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:32,634 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000029] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000029),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:33,654 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000029 buffer server: laptop-name:45093
2018-06-18 19:35:33,911 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000029),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:35,845 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 26
2018-06-18 19:35:35,845 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:36,649 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000029
2018-06-18 19:35:36,649 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000029
2018-06-18 19:35:36,651 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:37,655 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000029, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:37,655 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000029@localhost:40317
2018-06-18 19:35:37,655 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:38,700 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:38,700 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:39,710 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000030, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority28
2018-06-18 19:35:39,710 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000029
2018-06-18 19:35:39,712 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000030
2018-06-18 19:35:39,712 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:39,760 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:39,761 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000030
2018-06-18 19:35:39,761 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000030 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:35:39,761 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000030
2018-06-18 19:35:39,761 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:41,832 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000030] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000030),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:42,852 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000030 buffer server: laptop-name:37837
2018-06-18 19:35:43,085 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000030),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:44,999 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 27
2018-06-18 19:35:44,999 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:45,780 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000030
2018-06-18 19:35:45,781 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000030
2018-06-18 19:35:45,782 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:46,786 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000030, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:46,786 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000030@localhost:40317
2018-06-18 19:35:46,786 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:47,827 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:47,827 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:48,837 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000031, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority29
2018-06-18 19:35:48,837 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000030
2018-06-18 19:35:48,838 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000031
2018-06-18 19:35:48,839 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:48,892 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:48,892 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000031
2018-06-18 19:35:48,893 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000031 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:35:48,893 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000031
2018-06-18 19:35:48,893 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:50,958 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000031] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000031),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:51,966 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000031 buffer server: laptop-name:36333
2018-06-18 19:35:52,194 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000031),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:35:54,117 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 28
2018-06-18 19:35:54,117 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:35:54,903 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000031
2018-06-18 19:35:54,903 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000031
2018-06-18 19:35:54,905 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:35:55,910 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000031, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:35:55,911 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000031@localhost:40317
2018-06-18 19:35:55,911 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:35:56,955 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:35:56,955 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:35:57,967 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000032, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority30
2018-06-18 19:35:57,967 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000031
2018-06-18 19:35:57,969 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000032
2018-06-18 19:35:57,969 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:35:58,024 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:35:58,024 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000032
2018-06-18 19:35:58,025 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000032 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:35:58,025 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000032
2018-06-18 19:35:58,025 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:00,115 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000032] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000032),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:01,136 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000032 buffer server: laptop-name:44103
2018-06-18 19:36:01,394 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000032),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:03,304 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 29
2018-06-18 19:36:03,305 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:04,036 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000032
2018-06-18 19:36:04,036 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000032
2018-06-18 19:36:04,038 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:05,044 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000032, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:05,044 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000032@localhost:40317
2018-06-18 19:36:05,044 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:06,082 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:06,082 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:07,092 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000033, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority31
2018-06-18 19:36:07,092 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000032
2018-06-18 19:36:07,094 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000033
2018-06-18 19:36:07,094 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:07,150 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:36:07,150 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000033
2018-06-18 19:36:07,151 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000033 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:36:07,151 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000033
2018-06-18 19:36:07,151 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:09,189 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000033] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000033),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:10,209 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000033 buffer server: laptop-name:37053
2018-06-18 19:36:10,433 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000033),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:12,360 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 30
2018-06-18 19:36:12,360 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:13,160 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000033
2018-06-18 19:36:13,160 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000033
2018-06-18 19:36:13,162 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:14,166 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000033, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:14,167 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000033@localhost:40317
2018-06-18 19:36:14,167 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:15,206 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:15,206 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:16,215 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000034, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority32
2018-06-18 19:36:16,215 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000033
2018-06-18 19:36:16,217 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000034
2018-06-18 19:36:16,217 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:16,264 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:36:16,264 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000034
2018-06-18 19:36:16,264 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000034 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:36:16,264 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000034
2018-06-18 19:36:16,264 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:18,298 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000034] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000034),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:19,317 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000034 buffer server: laptop-name:36259
2018-06-18 19:36:19,521 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000034),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:21,447 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 31
2018-06-18 19:36:21,447 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:22,274 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000034
2018-06-18 19:36:22,275 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000034
2018-06-18 19:36:22,276 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:23,280 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000034, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:23,280 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000034@localhost:40317
2018-06-18 19:36:23,280 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:24,323 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:24,323 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:25,333 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000035, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority33
2018-06-18 19:36:25,333 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000034
2018-06-18 19:36:25,335 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000035
2018-06-18 19:36:25,335 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:25,394 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:36:25,394 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000035 2018-06-18 19:36:25,395 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000035 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:36:25,395 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000035 2018-06-18 19:36:25,396 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:36:27,477 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000035] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000035),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:36:28,496 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000035 buffer server: laptop-name:46709 2018-06-18 19:36:28,760 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000035),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:36:30,694 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 32 2018-06-18 19:36:30,695 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:36:31,408 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000035 2018-06-18 19:36:31,409 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000035 2018-06-18 19:36:31,411 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:36:32,416 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000035, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. 
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:32,416 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000035@localhost:40317
2018-06-18 19:36:32,416 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:33,461 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:33,461 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:34,471 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000036, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority34
2018-06-18 19:36:34,471 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000035
2018-06-18 19:36:34,473 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000036
2018-06-18 19:36:34,473 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:34,519 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:36:34,519 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000036
2018-06-18 19:36:34,520 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000036 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:36:34,520 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000036
2018-06-18 19:36:34,520 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:36,576 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000036] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000036),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:37,595 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000036 buffer server: laptop-name:35763
2018-06-18 19:36:37,872 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000036),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:39,815 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 33
2018-06-18 19:36:39,815 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:40,529 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000036
2018-06-18 19:36:40,530 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000036
2018-06-18 19:36:40,532 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:41,537 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000036, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:41,537 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000036@localhost:40317
2018-06-18 19:36:41,537 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:42,574 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:42,574 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:43,583 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000037, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority35
2018-06-18 19:36:43,583 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000036
2018-06-18 19:36:43,585 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000037
2018-06-18 19:36:43,585 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:43,628 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:36:43,628 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000037
2018-06-18 19:36:43,628 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000037 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:36:43,628 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000037
2018-06-18 19:36:43,629 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:45,689 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000037] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000037),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:46,708 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000037 buffer server: laptop-name:39841
2018-06-18 19:36:46,932 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000037),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:48,876 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 34
2018-06-18 19:36:48,876 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:49,641 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000037
2018-06-18 19:36:49,642 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000037
2018-06-18 19:36:49,644 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:50,650 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000037, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:50,650 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000037@localhost:40317
2018-06-18 19:36:50,651 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:36:51,690 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:36:51,690 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:36:52,701 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000038, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority36
2018-06-18 19:36:52,701 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000037
2018-06-18 19:36:52,702 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000038
2018-06-18 19:36:52,702 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:36:52,757 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:36:52,757 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000038
2018-06-18 19:36:52,757 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000038 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:36:52,757 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000038
2018-06-18 19:36:52,757 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:54,821 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000038] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000038),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:55,840 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000038 buffer server: laptop-name:35145
2018-06-18 19:36:56,071 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000038),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:36:57,987 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 35
2018-06-18 19:36:57,987 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:36:58,774 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000038
2018-06-18 19:36:58,774 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000038
2018-06-18 19:36:58,775 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:36:59,778 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000038, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:36:59,778 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000038@localhost:40317
2018-06-18 19:36:59,778 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:00,814 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:00,814 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:01,821 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000039, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority37
2018-06-18 19:37:01,821 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000038
2018-06-18 19:37:01,823 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000039
2018-06-18 19:37:01,823 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:01,874 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:01,875 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000039
2018-06-18 19:37:01,875 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000039 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:01,875 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000039
2018-06-18 19:37:01,875 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:03,883 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000039] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000039),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:04,902 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000039 buffer server: laptop-name:38997
2018-06-18 19:37:05,121 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000039),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:07,034 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 36
2018-06-18 19:37:07,035 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:07,883 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000039
2018-06-18 19:37:07,884 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000039
2018-06-18 19:37:07,886 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:08,890 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000039, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:37:08,890 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000039@localhost:40317
2018-06-18 19:37:08,891 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:09,929 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:09,929 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:10,938 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000040, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority38
2018-06-18 19:37:10,938 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000039
2018-06-18 19:37:10,940 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000040
2018-06-18 19:37:10,940 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:10,978 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:10,979 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000040
2018-06-18 19:37:10,979 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000040 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:10,979 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000040
2018-06-18 19:37:10,979 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:13,077 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000040] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000040),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:14,096 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000040 buffer server: laptop-name:43405
2018-06-18 19:37:14,315 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000040),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:16,238 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 37
2018-06-18 19:37:16,238 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:16,988 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000040
2018-06-18 19:37:16,989 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000040
2018-06-18 19:37:16,990 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:17,994 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000040, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:37:17,994 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000040@localhost:40317
2018-06-18 19:37:17,995 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:19,032 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:19,032 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:20,041 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000041, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority39
2018-06-18 19:37:20,042 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000040
2018-06-18 19:37:20,043 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000041
2018-06-18 19:37:20,044 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:20,098 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:20,098 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000041
2018-06-18 19:37:20,098 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000041 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:20,099 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000041
2018-06-18 19:37:20,099 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:22,162 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000041] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000041),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:23,182 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000041 buffer server: laptop-name:38133
2018-06-18 19:37:23,398 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000041),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:25,335 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 38
2018-06-18 19:37:25,335 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:26,127 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000041
2018-06-18 19:37:26,128 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000041
2018-06-18 19:37:26,130 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:27,143 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000041, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:37:27,143 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000041@localhost:40317 2018-06-18 19:37:27,143 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:37:28,182 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:37:28,183 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:37:29,192 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000042, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority40 2018-06-18 19:37:29,193 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000041 2018-06-18 19:37:29,194 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000042 2018-06-18 19:37:29,194 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 
2018-06-18 19:37:29,248 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:37:29,248 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000042 2018-06-18 19:37:29,249 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000042 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:37:29,249 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000042 2018-06-18 19:37:29,249 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:37:31,255 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000042] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000042),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:37:32,274 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000042 buffer server: laptop-name:45203 2018-06-18 19:37:32,517 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000042),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:34,451 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 39
2018-06-18 19:37:34,452 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:35,259 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000042
2018-06-18 19:37:35,260 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000042
2018-06-18 19:37:35,261 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:36,265 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000042, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:37:36,265 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000042@localhost:40317
2018-06-18 19:37:36,265 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:37,299 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:37,299 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:38,306 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000043, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority41
2018-06-18 19:37:38,306 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000042
2018-06-18 19:37:38,306 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000043
2018-06-18 19:37:38,306 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:38,321 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:38,321 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000043
2018-06-18 19:37:38,321 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000043 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:38,321 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000043
2018-06-18 19:37:38,321 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:40,378 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000043] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000043),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:41,397 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000043 buffer server: laptop-name:37551
2018-06-18 19:37:41,677 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000043),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:43,617 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 40
2018-06-18 19:37:43,617 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:44,332 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000043
2018-06-18 19:37:44,333 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000043
2018-06-18 19:37:44,334 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:45,339 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000043, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:37:45,339 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000043@localhost:40317
2018-06-18 19:37:45,339 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:46,375 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:46,375 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:47,383 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000044, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority42
2018-06-18 19:37:47,384 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000043
2018-06-18 19:37:47,385 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000044
2018-06-18 19:37:47,385 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:47,438 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:47,438 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000044
2018-06-18 19:37:47,439 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000044 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:47,439 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000044
2018-06-18 19:37:47,439 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:49,541 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000044] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000044),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:50,560 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000044 buffer server: laptop-name:43271
2018-06-18 19:37:50,821 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000044),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:52,768 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 41
2018-06-18 19:37:52,769 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:37:53,451 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000044
2018-06-18 19:37:53,452 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000044
2018-06-18 19:37:53,453 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:54,458 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000044, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:37:54,458 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000044@localhost:40317
2018-06-18 19:37:54,458 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:37:55,497 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:37:55,497 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:37:56,505 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000045, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority43
2018-06-18 19:37:56,506 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000044
2018-06-18 19:37:56,507 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000045
2018-06-18 19:37:56,507 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:37:56,543 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:37:56,543 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000045
2018-06-18 19:37:56,543 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000045 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:37:56,543 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000045
2018-06-18 19:37:56,543 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:37:58,720 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000045] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000045),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:37:59,739 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000045 buffer server: laptop-name:39349
2018-06-18 19:37:59,971 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000045),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:01,897 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 42
2018-06-18 19:38:01,898 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:02,554 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000045
2018-06-18 19:38:02,554 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000045
2018-06-18 19:38:02,557 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:03,562 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000045, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:03,562 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000045@localhost:40317
2018-06-18 19:38:03,562 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:04,598 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:04,598 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:05,607 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000046, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority44
2018-06-18 19:38:05,607 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000045
2018-06-18 19:38:05,609 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000046
2018-06-18 19:38:05,609 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:05,654 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:05,654 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000046
2018-06-18 19:38:05,654 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000046 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:05,655 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000046
2018-06-18 19:38:05,655 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:07,692 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000046] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000046),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:08,712 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000046 buffer server: laptop-name:37299
2018-06-18 19:38:08,927 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000046),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:10,839 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 43
2018-06-18 19:38:10,839 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:11,666 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000046
2018-06-18 19:38:11,667 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000046
2018-06-18 19:38:11,668 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:12,012 INFO org.apache.hadoop.ipc.Server: Socket Reader #1 for port 42381: readAndProcess from client 127.0.0.1 threw exception [java.io.IOException: Connection reset by peer]
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:197)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at org.apache.hadoop.ipc.Server.channelRead(Server.java:2604)
    at org.apache.hadoop.ipc.Server.access$2800(Server.java:136)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1481)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:771)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:637)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:608)
2018-06-18 19:38:12,673 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000046, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:12,673 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000046@localhost:40317
2018-06-18 19:38:12,673 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:13,709 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:13,709 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:14,717 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000047, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority45
2018-06-18 19:38:14,717 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000046
2018-06-18 19:38:14,718 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000047
2018-06-18 19:38:14,718 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:14,760 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:14,760 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000047
2018-06-18 19:38:14,760 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000047 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:14,760 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000047
2018-06-18 19:38:14,761 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:16,823 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000047] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000047),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:17,841 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000047 buffer server: laptop-name:38149
2018-06-18 19:38:18,063 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000047),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:19,992 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 44
2018-06-18 19:38:19,992 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:20,770 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000047
2018-06-18 19:38:20,771 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000047
2018-06-18 19:38:20,772 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:21,776 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000047, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:21,776 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000047@localhost:40317
2018-06-18 19:38:21,776 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:22,811 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:22,811 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:23,820 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000048, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority46
2018-06-18 19:38:23,820 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000047
2018-06-18 19:38:23,821 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000048
2018-06-18 19:38:23,821 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:23,872 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:23,872 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000048
2018-06-18 19:38:23,873 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000048 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:23,873 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000048
2018-06-18 19:38:23,873 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:25,938 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000048] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000048),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:26,959 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000048 buffer server: laptop-name:36133
2018-06-18 19:38:27,263 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000048),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:38:29,183 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 45 2018-06-18 19:38:29,183 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:38:29,886 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000048 2018-06-18 19:38:29,886 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000048 2018-06-18 19:38:29,888 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:38:30,893 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000048, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. 
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:30,893 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000048@localhost:40317
2018-06-18 19:38:30,893 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:31,930 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:31,930 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:32,939 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000049, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority47
2018-06-18 19:38:32,939 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000048
2018-06-18 19:38:32,941 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000049
2018-06-18 19:38:32,941 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:32,990 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:32,990 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000049
2018-06-18 19:38:32,990 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000049 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:32,990 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000049
2018-06-18 19:38:32,990 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:35,059 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000049] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000049),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:36,079 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000049 buffer server: laptop-name:33053
2018-06-18 19:38:36,357 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000049),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:38,272 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 46
2018-06-18 19:38:38,272 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:38,999 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000049
2018-06-18 19:38:39,000 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000049
2018-06-18 19:38:39,001 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:40,006 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000049, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:40,007 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000049@localhost:40317
2018-06-18 19:38:40,007 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:41,043 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:41,043 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:42,052 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000050, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority48
2018-06-18 19:38:42,052 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000049
2018-06-18 19:38:42,053 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000050
2018-06-18 19:38:42,053 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:42,108 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:42,108 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000050
2018-06-18 19:38:42,108 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000050 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:42,108 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000050
2018-06-18 19:38:42,108 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:44,173 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000050] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000050),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:45,193 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000050 buffer server: laptop-name:43827
2018-06-18 19:38:45,466 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000050),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:47,401 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 47
2018-06-18 19:38:47,402 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:48,118 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000050
2018-06-18 19:38:48,119 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000050
2018-06-18 19:38:48,120 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:49,125 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000050, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:49,125 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000050@localhost:40317
2018-06-18 19:38:49,125 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:50,161 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:50,161 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:38:51,170 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000051, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority49
2018-06-18 19:38:51,170 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000050
2018-06-18 19:38:51,171 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000051
2018-06-18 19:38:51,172 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:38:51,227 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:38:51,227 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000051
2018-06-18 19:38:51,227 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000051 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:38:51,228 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000051
2018-06-18 19:38:51,229 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:53,257 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000051] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000051),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:54,276 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000051 buffer server: laptop-name:37559
2018-06-18 19:38:54,538 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000051),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:38:56,479 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 48
2018-06-18 19:38:56,479 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:38:57,240 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000051
2018-06-18 19:38:57,240 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000051
2018-06-18 19:38:57,241 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:38:58,245 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000051, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:38:58,245 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000051@localhost:40317
2018-06-18 19:38:58,245 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:38:59,287 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:38:59,287 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:00,297 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000052, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority50
2018-06-18 19:39:00,297 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000051
2018-06-18 19:39:00,298 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000052
2018-06-18 19:39:00,298 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:00,331 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:00,331 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000052
2018-06-18 19:39:00,331 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000052 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:00,332 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000052
2018-06-18 19:39:00,332 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:02,439 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000052] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000052),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:03,459 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000052 buffer server: laptop-name:39383
2018-06-18 19:39:03,710 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000052),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:05,633 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 49
2018-06-18 19:39:05,633 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:06,341 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000052
2018-06-18 19:39:06,342 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000052
2018-06-18 19:39:06,344 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:07,349 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000052, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:39:07,349 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000052@localhost:40317
2018-06-18 19:39:07,349 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:39:08,386 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:39:08,386 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:09,396 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000053, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority51
2018-06-18 19:39:09,397 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000052
2018-06-18 19:39:09,398 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000053
2018-06-18 19:39:09,398 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:09,444 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:09,445 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000053
2018-06-18 19:39:09,445 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000053 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:09,445 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000053
2018-06-18 19:39:09,445 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:11,464 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000053] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000053),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:12,484 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000053 buffer server: laptop-name:37073
2018-06-18 19:39:12,762 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000053),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:14,683 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 50
2018-06-18 19:39:14,683 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:15,455 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000053
2018-06-18 19:39:15,456 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000053
2018-06-18 19:39:15,457 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:16,461 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000053, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:39:16,461 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000053@localhost:40317
2018-06-18 19:39:16,462 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:39:17,497 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:39:17,498 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:18,506 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000054, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority52
2018-06-18 19:39:18,506 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000053
2018-06-18 19:39:18,507 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000054
2018-06-18 19:39:18,507 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:18,547 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:18,548 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000054
2018-06-18 19:39:18,548 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000054 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:18,548 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000054
2018-06-18 19:39:18,548 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:20,609 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000054] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000054),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:21,628 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000054 buffer server: laptop-name:36435
2018-06-18 19:39:21,899 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000054),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:23,828 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 51
2018-06-18 19:39:23,828 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:24,562 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000054
2018-06-18 19:39:24,563 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000054
2018-06-18 19:39:24,564 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:25,571 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000054, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request.
Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:39:25,571 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000054@localhost:40317 2018-06-18 19:39:25,572 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:39:26,606 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:39:26,606 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:39:27,615 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000055, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority53 2018-06-18 19:39:27,615 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000054 2018-06-18 19:39:27,616 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000055 2018-06-18 19:39:27,616 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 
2018-06-18 19:39:27,668 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:39:27,668 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000055 2018-06-18 19:39:27,668 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000055 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:39:27,668 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000055 2018-06-18 19:39:27,668 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:39:29,775 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000055] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000055),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:39:30,794 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000055 buffer server: laptop-name:34603 2018-06-18 19:39:31,068 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000055),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:32,994 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 52
2018-06-18 19:39:32,994 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:33,679 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000055
2018-06-18 19:39:33,681 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000055
2018-06-18 19:39:33,682 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:34,687 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000055, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:39:34,687 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000055@localhost:40317
2018-06-18 19:39:34,688 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:39:35,725 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:39:35,725 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:36,734 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000056, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority54
2018-06-18 19:39:36,735 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000055
2018-06-18 19:39:36,736 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000056
2018-06-18 19:39:36,736 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:36,780 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:36,780 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000056
2018-06-18 19:39:36,781 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000056 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:36,781 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000056
2018-06-18 19:39:36,781 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:38,809 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000056] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000056),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:39,819 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000056 buffer server: laptop-name:42857
2018-06-18 19:39:40,033 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000056),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:41,967 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 53
2018-06-18 19:39:41,967 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:42,792 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000056
2018-06-18 19:39:42,792 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000056
2018-06-18 19:39:42,794 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:43,798 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000056, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:39:43,798 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000056@localhost:40317
2018-06-18 19:39:43,799 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:39:44,838 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:39:44,838 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:45,847 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000057, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority55
2018-06-18 19:39:45,847 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000056
2018-06-18 19:39:45,848 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000057
2018-06-18 19:39:45,848 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:45,902 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:45,902 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000057
2018-06-18 19:39:45,903 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000057 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:45,903 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000057
2018-06-18 19:39:45,903 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:47,952 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000057] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000057),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:48,972 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000057 buffer server: laptop-name:42947
2018-06-18 19:39:49,257 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000057),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:51,172 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 54
2018-06-18 19:39:51,172 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:39:51,915 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000057
2018-06-18 19:39:51,916 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000057
2018-06-18 19:39:51,917 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:52,921 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000057, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:39:52,921 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000057@localhost:40317
2018-06-18 19:39:52,921 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:39:53,958 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:39:53,958 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:39:54,967 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000058, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority56
2018-06-18 19:39:54,967 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000057
2018-06-18 19:39:54,969 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000058
2018-06-18 19:39:54,969 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:39:55,020 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:39:55,020 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000058
2018-06-18 19:39:55,020 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000058 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:39:55,021 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000058
2018-06-18 19:39:55,021 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:39:57,067 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000058] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000058),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:39:58,088 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000058 buffer server: laptop-name:33033
2018-06-18 19:39:58,341 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000058),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:00,251 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 55
2018-06-18 19:40:00,251 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:01,032 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000058
2018-06-18 19:40:01,032 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000058
2018-06-18 19:40:01,034 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:02,039 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000058, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:02,039 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000058@localhost:40317
2018-06-18 19:40:02,040 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:03,077 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:03,077 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:04,085 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000059, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority57
2018-06-18 19:40:04,085 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000058
2018-06-18 19:40:04,086 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000059
2018-06-18 19:40:04,087 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:04,140 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:04,140 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000059
2018-06-18 19:40:04,141 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000059 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:04,141 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000059
2018-06-18 19:40:04,142 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:06,281 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000059] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000059),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:07,147 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:07,301 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000059 buffer server: laptop-name:38771
2018-06-18 19:40:07,587 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000059),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:08,150 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:09,153 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:09,511 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 56
2018-06-18 19:40:09,512 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:10,154 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000059
2018-06-18 19:40:10,154 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000059
2018-06-18 19:40:10,155 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:10,155 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:11,158 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000059, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:11,158 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000059@localhost:40317
2018-06-18 19:40:11,158 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:11,193 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:12,194 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:12,194 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:12,197 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:13,203 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000060, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority58
2018-06-18 19:40:13,204 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000059
2018-06-18 19:40:13,205 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000060
2018-06-18 19:40:13,205 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:13,253 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:13,253 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000060
2018-06-18 19:40:13,254 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000060 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:13,254 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000060
2018-06-18 19:40:13,254 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:13,255 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:14,257 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:15,258 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:15,337 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000060] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000060),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:16,262 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:16,357 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000060 buffer server: laptop-name:33775
2018-06-18 19:40:16,572 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000060),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:17,265 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:18,267 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:18,496 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 57
2018-06-18 19:40:18,497 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:19,268 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000060
2018-06-18 19:40:19,268 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000060
2018-06-18 19:40:19,269 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:19,271 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:20,273 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000060, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:20,273 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000060@localhost:40317
2018-06-18 19:40:20,273 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:20,305 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:21,306 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:21,306 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:21,309 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:22,313 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000061, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority59
2018-06-18 19:40:22,313 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000060
2018-06-18 19:40:22,314 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000061
2018-06-18 19:40:22,314 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:22,362 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:22,362 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000061
2018-06-18 19:40:22,363 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000061 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:22,363 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000061
2018-06-18 19:40:22,363 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:22,364 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:23,367 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:24,368 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:24,447 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000061] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000061),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:25,370 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:40:25,467 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000061 buffer server: laptop-name:34539
2018-06-18 19:40:25,694 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000061),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:40:26,371 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:40:27,372 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:40:27,619 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 58 2018-06-18 19:40:27,620 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:40:28,373 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000061 2018-06-18 19:40:28,373 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000061 2018-06-18 19:40:28,374 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:40:28,376 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:40:29,379 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000061, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:29,379 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000061@localhost:40317
2018-06-18 19:40:29,379 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:29,423 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:30,424 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:30,424 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:30,428 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:31,432 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000062, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority60
2018-06-18 19:40:31,432 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000061
2018-06-18 19:40:31,434 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000062
2018-06-18 19:40:31,434 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:31,481 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:31,482 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000062
2018-06-18 19:40:31,482 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000062 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:31,482 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000062
2018-06-18 19:40:31,483 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:31,483 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:32,487 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:33,489 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:33,581 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000062] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000062),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:34,490 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:34,601 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000062 buffer server: laptop-name:39207
2018-06-18 19:40:34,884 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000062),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:35,493 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:36,496 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:36,791 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 59
2018-06-18 19:40:36,792 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:37,496 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000062
2018-06-18 19:40:37,497 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000062
2018-06-18 19:40:37,498 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:37,501 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:38,504 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000062, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:38,504 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000062@localhost:40317
2018-06-18 19:40:38,504 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:38,540 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:39,540 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:39,541 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:39,544 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:40,550 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000063, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority61
2018-06-18 19:40:40,550 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000062
2018-06-18 19:40:40,551 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000063
2018-06-18 19:40:40,551 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:40,602 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:40,602 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000063
2018-06-18 19:40:40,603 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000063 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:40,603 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000063
2018-06-18 19:40:40,603 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:40,604 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:41,606 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:42,608 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:42,669 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000063] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000063),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:43,610 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:43,688 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000063 buffer server: laptop-name:42113
2018-06-18 19:40:43,926 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000063),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:44,613 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:45,616 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:45,854 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 60
2018-06-18 19:40:45,854 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:46,617 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000063
2018-06-18 19:40:46,617 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000063
2018-06-18 19:40:46,619 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:46,621 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:47,623 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000063, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:47,624 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000063@localhost:40317
2018-06-18 19:40:47,624 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:47,662 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:48,663 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:48,663 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:48,667 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:49,673 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000064, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority62
2018-06-18 19:40:49,673 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000063
2018-06-18 19:40:49,675 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000064
2018-06-18 19:40:49,675 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:49,717 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:49,717 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000064
2018-06-18 19:40:49,717 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000064 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:49,717 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000064
2018-06-18 19:40:49,717 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:49,718 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:50,719 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:51,721 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:51,959 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000064] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000064),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:52,723 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:52,978 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000064 buffer server: laptop-name:46485
2018-06-18 19:40:53,249 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000064),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:40:53,726 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:54,729 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:55,186 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 61
2018-06-18 19:40:55,187 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:40:55,730 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000064
2018-06-18 19:40:55,730 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000064
2018-06-18 19:40:55,731 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:55,733 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:56,735 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000064, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:40:56,736 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000064@localhost:40317
2018-06-18 19:40:56,736 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:40:56,776 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:57,776 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:40:57,776 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:40:57,780 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:58,787 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000065, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority63
2018-06-18 19:40:58,787 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000064
2018-06-18 19:40:58,789 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000065
2018-06-18 19:40:58,789 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:40:58,839 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:40:58,839 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000065
2018-06-18 19:40:58,839 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000065 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:40:58,839 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000065
2018-06-18 19:40:58,840 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:40:58,844 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:40:59,846 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:00,848 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:00,911 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000065] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000065),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:01,850 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:01,930 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000065 buffer server: laptop-name:35635
2018-06-18 19:41:02,141 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000065),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:02,853 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:03,856 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:04,060 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 62
2018-06-18 19:41:04,061 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:41:04,857 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000065
2018-06-18 19:41:04,857 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000065
2018-06-18 19:41:04,859 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:41:04,860 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:05,863 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000065, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:41:05,863 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000065@localhost:40317
2018-06-18 19:41:05,863 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:41:05,906 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:06,907 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:41:06,907 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:41:06,911 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:07,918 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000066, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority64
2018-06-18 19:41:07,918 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000065
2018-06-18 19:41:07,919 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000066
2018-06-18 19:41:07,920 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:41:07,971 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:41:07,971 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000066
2018-06-18 19:41:07,972 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000066 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:41:07,972 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000066
2018-06-18 19:41:07,972 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:41:07,973 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:08,975 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:09,977 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:10,192 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000066] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000066),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:10,979 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:11,212 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000066 buffer server: laptop-name:35367
2018-06-18 19:41:11,450 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000066),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:11,981 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:12,984 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:41:13,374 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 63
2018-06-18 19:41:13,374 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:41:13,984 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000066
2018-06-18 19:41:13,984 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000066
2018-06-18 19:41:13,986 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:41:13,987 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:41:14,990 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000066, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:41:14,991 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000066@localhost:40317 2018-06-18 19:41:14,991 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:41:15,026 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:16,027 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:41:16,027 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:41:16,031 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:17,036 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000067, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority65 2018-06-18 19:41:17,036 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000066 2018-06-18 19:41:17,038 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000067 2018-06-18 19:41:17,038 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:41:17,087 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:41:17,087 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000067 2018-06-18 19:41:17,087 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000067 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:41:17,087 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000067 2018-06-18 19:41:17,088 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:41:17,088 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:18,090 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:19,092 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:19,231 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000067] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000067),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:20,094 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:20,256 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000067 buffer server: laptop-name:36043 2018-06-18 19:41:20,481 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000067),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:21,096 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:22,098 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:22,406 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 64 2018-06-18 19:41:22,407 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:41:23,099 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000067 2018-06-18 19:41:23,099 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000067 2018-06-18 19:41:23,101 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:41:23,103 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:24,106 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000067, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:41:24,106 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000067@localhost:40317 2018-06-18 19:41:24,106 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:41:24,155 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:25,156 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:41:25,156 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:41:25,159 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:26,165 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000068, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority66 2018-06-18 19:41:26,165 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000067 2018-06-18 19:41:26,166 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000068 2018-06-18 19:41:26,166 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:41:26,213 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:41:26,213 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000068 2018-06-18 19:41:26,213 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000068 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:41:26,214 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000068 2018-06-18 19:41:26,214 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:41:26,214 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:27,217 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:28,218 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:28,324 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000068] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000068),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:29,221 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:29,347 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000068 buffer server: laptop-name:42427 2018-06-18 19:41:29,635 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000068),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:30,223 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:31,225 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:31,558 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 65 2018-06-18 19:41:31,559 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:41:32,226 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000068 2018-06-18 19:41:32,226 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000068 2018-06-18 19:41:32,226 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:41:32,227 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:33,229 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000068, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:41:33,230 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000068@localhost:40317 2018-06-18 19:41:33,230 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:41:33,265 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:34,266 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:41:34,266 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:41:34,269 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:35,273 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000069, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority67 2018-06-18 19:41:35,274 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000068 2018-06-18 19:41:35,275 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000069 2018-06-18 19:41:35,275 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:41:35,306 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:41:35,306 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000069 2018-06-18 19:41:35,306 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000069 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:41:35,306 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000069 2018-06-18 19:41:35,307 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:41:35,307 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:36,309 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:37,316 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:37,412 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000069] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000069),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:38,317 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:38,431 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000069 buffer server: laptop-name:41671 2018-06-18 19:41:38,656 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000069),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:39,320 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:40,323 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:40,579 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 66 2018-06-18 19:41:40,579 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:41:41,324 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000069 2018-06-18 19:41:41,324 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000069 2018-06-18 19:41:41,325 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:41:41,327 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:42,330 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000069, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:41:42,330 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000069@localhost:40317 2018-06-18 19:41:42,330 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:41:42,359 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:43,359 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:41:43,359 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:41:43,362 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:41:44,368 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000070, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority68 2018-06-18 19:41:44,368 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000069 2018-06-18 19:41:44,369 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000070 2018-06-18 19:41:44,370 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:41:44,440 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:41:44,440 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000070 2018-06-18 19:41:44,441 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000070 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:41:44,441 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000070 2018-06-18 19:41:44,441 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:41:44,442 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:45,445 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:46,447 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:46,676 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000070] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000070),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:47,450 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:47,703 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000070 buffer server: laptop-name:45325 2018-06-18 19:41:47,962 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000070),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:41:48,452 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:49,454 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:41:49,915 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 67 2018-06-18 19:41:49,916 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:41:50,455 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000070 2018-06-18 19:41:50,455 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000070 2018-06-18 19:41:50,457 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:41:50,460 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:41:51,462 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000070, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:41:51,462 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000070@localhost:40317
2018-06-18 19:41:51,462 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:41:51,490 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:52,490 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:41:52,491 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:41:52,494 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:53,499 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000071, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority69
2018-06-18 19:41:53,500 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000070
2018-06-18 19:41:53,501 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000071
2018-06-18 19:41:53,501 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:41:53,554 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:41:53,554 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000071
2018-06-18 19:41:53,554 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000071 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:41:53,555 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000071
2018-06-18 19:41:53,555 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:41:53,555 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:54,561 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:55,563 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:55,679 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000071] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000071),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:56,565 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:56,699 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000071 buffer server: laptop-name:46773
2018-06-18 19:41:56,999 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000071),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:41:57,568 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:58,571 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:41:58,885 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 68
2018-06-18 19:41:58,886 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:41:59,572 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000071
2018-06-18 19:41:59,572 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000071
2018-06-18 19:41:59,574 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:41:59,575 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:00,578 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000071, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:00,578 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000071@localhost:40317
2018-06-18 19:42:00,578 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:00,615 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:01,615 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:01,615 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:01,618 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:02,624 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000072, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority70
2018-06-18 19:42:02,624 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000071
2018-06-18 19:42:02,626 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000072
2018-06-18 19:42:02,626 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:02,661 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:02,661 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000072
2018-06-18 19:42:02,661 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000072 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:02,662 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000072
2018-06-18 19:42:02,662 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:02,662 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:03,667 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:04,668 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:04,792 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000072] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000072),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:05,671 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:05,811 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000072 buffer server: laptop-name:38865
2018-06-18 19:42:06,064 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000072),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:06,674 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:07,677 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:07,975 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 69
2018-06-18 19:42:07,976 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:08,678 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000072
2018-06-18 19:42:08,678 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000072
2018-06-18 19:42:08,680 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:08,681 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:09,684 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000072, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:09,684 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000072@localhost:40317
2018-06-18 19:42:09,685 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:09,722 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:10,722 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:10,723 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:10,726 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:11,732 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000073, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority71
2018-06-18 19:42:11,733 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000072
2018-06-18 19:42:11,734 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000073
2018-06-18 19:42:11,734 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:11,774 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:11,774 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000073
2018-06-18 19:42:11,774 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000073 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:11,774 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000073
2018-06-18 19:42:11,774 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:11,775 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:12,778 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:13,780 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:13,832 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000073] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000073),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:14,783 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:14,842 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000073 buffer server: laptop-name:46099
2018-06-18 19:42:15,072 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000073),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:15,785 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:16,788 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:16,968 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 70
2018-06-18 19:42:16,969 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:17,789 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000073
2018-06-18 19:42:17,789 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000073
2018-06-18 19:42:17,791 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:17,792 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:18,795 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000073, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:18,795 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000073@localhost:40317
2018-06-18 19:42:18,795 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:18,830 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:19,830 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:19,830 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:19,833 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:20,841 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000074, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority72
2018-06-18 19:42:20,842 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000073
2018-06-18 19:42:20,843 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000074
2018-06-18 19:42:20,844 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:20,890 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:20,890 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000074
2018-06-18 19:42:20,890 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000074 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:20,890 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000074
2018-06-18 19:42:20,891 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:20,892 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:21,897 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:22,898 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:22,931 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000074] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000074),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:23,901 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:23,952 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000074 buffer server: laptop-name:44077
2018-06-18 19:42:24,210 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000074),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:24,904 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:25,912 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:26,149 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 71
2018-06-18 19:42:26,149 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:26,913 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000074
2018-06-18 19:42:26,913 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000074
2018-06-18 19:42:26,915 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:26,918 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:27,921 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000074, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:27,921 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000074@localhost:40317
2018-06-18 19:42:27,921 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:27,956 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:28,956 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:28,956 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:28,961 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:29,966 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000075, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority73
2018-06-18 19:42:29,967 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000074
2018-06-18 19:42:29,968 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000075
2018-06-18 19:42:29,968 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:30,013 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:30,013 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000075
2018-06-18 19:42:30,013 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000075 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:30,014 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000075
2018-06-18 19:42:30,014 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:30,014 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:31,016 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:32,018 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:32,152 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000075] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000075),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:33,020 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:33,171 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000075 buffer server: laptop-name:41833
2018-06-18 19:42:33,454 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000075),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:34,023 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:35,026 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:42:35,380 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 72
2018-06-18 19:42:35,380 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:36,027 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000075
2018-06-18 19:42:36,027 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000075
2018-06-18 19:42:36,027 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:36,028 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:42:37,030 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000075, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:42:37,030 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000075@localhost:40317 2018-06-18 19:42:37,031 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:42:37,066 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:42:38,066 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:42:38,067 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:42:38,070 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:42:39,075 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000076, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority74
2018-06-18 19:42:39,075 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000075
2018-06-18 19:42:39,076 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000076
2018-06-18 19:42:39,077 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:39,111 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:39,111 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000076
2018-06-18 19:42:39,112 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000076 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:39,112 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000076
2018-06-18 19:42:39,112 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:39,113 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:40,116 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:41,117 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:41,212 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000076] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000076),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:42,119 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:42,231 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000076 buffer server: laptop-name:37547
2018-06-18 19:42:42,456 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000076),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:43,122 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:44,125 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:44,380 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 73
2018-06-18 19:42:44,381 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:45,126 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000076
2018-06-18 19:42:45,126 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000076
2018-06-18 19:42:45,127 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:45,129 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:46,132 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000076, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:46,132 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000076@localhost:40317
2018-06-18 19:42:46,132 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:46,168 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:47,169 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:47,169 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:47,172 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:48,177 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000077, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority75
2018-06-18 19:42:48,178 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000076
2018-06-18 19:42:48,179 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000077
2018-06-18 19:42:48,179 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:48,220 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:48,220 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000077
2018-06-18 19:42:48,220 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000077 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:48,221 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000077
2018-06-18 19:42:48,221 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:48,221 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:49,224 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:50,226 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:50,265 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000077] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000077),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:51,228 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:51,284 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000077 buffer server: laptop-name:40569
2018-06-18 19:42:51,541 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000077),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:42:52,231 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:53,234 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:53,468 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 74
2018-06-18 19:42:53,468 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:42:54,234 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000077
2018-06-18 19:42:54,234 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000077
2018-06-18 19:42:54,236 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:54,240 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:55,242 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000077, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:42:55,243 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000077@localhost:40317
2018-06-18 19:42:55,243 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:42:55,281 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:56,282 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:42:56,282 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:42:56,285 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:57,290 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000078, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority76
2018-06-18 19:42:57,290 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000077
2018-06-18 19:42:57,292 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000078
2018-06-18 19:42:57,292 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:42:57,339 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:42:57,340 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000078
2018-06-18 19:42:57,340 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000078 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:42:57,340 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000078
2018-06-18 19:42:57,340 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:42:57,341 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:58,343 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:59,345 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:42:59,408 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000078] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000078),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:00,347 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:00,427 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000078 buffer server: laptop-name:38015
2018-06-18 19:43:00,686 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000078),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:01,350 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:02,353 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:02,613 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 75
2018-06-18 19:43:02,614 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:03,353 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000078
2018-06-18 19:43:03,354 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000078
2018-06-18 19:43:03,355 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:03,357 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:04,360 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000078, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:04,360 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000078@localhost:40317
2018-06-18 19:43:04,360 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:04,399 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:05,400 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:05,400 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:05,404 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:06,410 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000079, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority77
2018-06-18 19:43:06,410 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000078
2018-06-18 19:43:06,412 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000079
2018-06-18 19:43:06,412 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:06,473 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:06,473 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000079
2018-06-18 19:43:06,474 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000079 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:06,474 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000079
2018-06-18 19:43:06,474 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:06,475 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:07,476 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:08,478 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:08,533 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000079] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000079),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:09,481 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:09,552 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000079 buffer server: laptop-name:33615
2018-06-18 19:43:09,771 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000079),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:10,484 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:11,487 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:11,693 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 76
2018-06-18 19:43:11,693 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:12,487 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000079
2018-06-18 19:43:12,487 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000079
2018-06-18 19:43:12,489 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:12,490 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:13,493 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000079, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:13,493 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000079@localhost:40317
2018-06-18 19:43:13,493 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:13,541 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:14,542 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:14,542 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:14,546 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:15,551 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000080, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority78
2018-06-18 19:43:15,551 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000079
2018-06-18 19:43:15,553 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000080
2018-06-18 19:43:15,553 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:15,632 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:15,632 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000080
2018-06-18 19:43:15,632 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000080 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:15,633 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:15,633 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000080
2018-06-18 19:43:15,633 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:16,634 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:17,636 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:18,375 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000080] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000080),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:18,638 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:19,415 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000080 buffer server: laptop-name:38029
2018-06-18 19:43:19,640 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:19,738 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000080),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:20,642 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:21,642 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 77
2018-06-18 19:43:21,642 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:21,642 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000080
2018-06-18 19:43:21,642 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000080
2018-06-18 19:43:21,643 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:21,644 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:22,646 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000080, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:22,646 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000080@localhost:40317
2018-06-18 19:43:22,647 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:22,676 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:23,677 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:23,677 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:23,680 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:24,683 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000081, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority79
2018-06-18 19:43:24,684 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000080
2018-06-18 19:43:24,684 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000081
2018-06-18 19:43:24,684 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:24,711 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:24,711 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000081
2018-06-18 19:43:24,711 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000081 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:24,711 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000081
2018-06-18 19:43:24,711 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:24,712 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:25,714 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:26,715 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:26,900 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000081] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000081),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:27,717 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:27,920 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000081 buffer server: laptop-name:40887
2018-06-18 19:43:28,180 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000081),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:43:28,718 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:43:29,721 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:43:30,104 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 78 2018-06-18 19:43:30,104 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:43:30,721 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000081 2018-06-18 19:43:30,722 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000081 2018-06-18 19:43:30,724 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:43:30,726 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
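The trace above originates in a test class, aptest.FailureGenerator, whose failOrNot() deliberately throws an unchecked ExpectedException from inside the Aggregator's input port, which is what drives the repeated container restarts in this log. A minimal standalone sketch of that failure-injection pattern follows; only the class, exception, and method names are taken from the stack trace, while the counter logic and the failEvery knob are assumptions, not the actual source.

```java
// Standalone sketch of the failure-injection pattern suggested by the stack
// trace (aptest.FailureGenerator.failOrNot). The real class is invoked from an
// Apex operator's process() callback; this sketch models only the throw logic.
// The per-tuple counter and the failEvery threshold are assumed, not from the source.
public class FailureGenerator {

    /** Unchecked, so it escapes the operator's process() and kills the container. */
    public static class ExpectedException extends RuntimeException {
        public ExpectedException(String message) {
            super(message);
        }
    }

    private long tupleCount = 0;
    private final long failEvery; // hypothetical knob: throw on every N-th tuple

    public FailureGenerator(long failEvery) {
        this.failEvery = failEvery;
    }

    /** Called once per processed tuple; throws on every failEvery-th call. */
    public void failOrNot() {
        tupleCount++;
        if (tupleCount % failEvery == 0) {
            throw new ExpectedException("deliberate failure at tuple " + tupleCount);
        }
    }

    public static void main(String[] args) {
        FailureGenerator gen = new FailureGenerator(3);
        gen.failOrNot(); // tuple 1: passes
        gen.failOrNot(); // tuple 2: passes
        try {
            gen.failOrNot(); // tuple 3: throws
        } catch (ExpectedException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

Because the exception is unchecked, the engine's GenericNode loop never catches it; the StreamingContainer reports "Stopped running due to an exception" to the master, which then kills and redeploys the container, matching the cycle visible in this log.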
Trimming the end window stats map
2018-06-18 19:43:31,728 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000081, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:31,728 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000081@localhost:40317
2018-06-18 19:43:31,729 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:31,770 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:32,770 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:32,770 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:32,774 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:33,780 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000082, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority80
2018-06-18 19:43:33,780 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000081
2018-06-18 19:43:33,781 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000082
2018-06-18 19:43:33,781 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:33,821 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:33,821 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000082
2018-06-18 19:43:33,822 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000082 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:33,822 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000082
2018-06-18 19:43:33,822 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:33,824 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:34,826 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:35,828 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:35,898 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000082] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000082),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:36,830 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:36,917 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000082 buffer server: laptop-name:36325
2018-06-18 19:43:37,117 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000082),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:37,832 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:38,834 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:39,033 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 79
2018-06-18 19:43:39,034 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:39,834 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000082
2018-06-18 19:43:39,835 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000082
2018-06-18 19:43:39,836 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:39,838 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:40,840 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000082, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:40,840 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000082@localhost:40317
2018-06-18 19:43:40,840 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:40,873 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:41,874 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:41,874 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:41,877 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:42,883 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000083, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority81
2018-06-18 19:43:42,884 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000082
2018-06-18 19:43:42,885 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000083
2018-06-18 19:43:42,885 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:42,929 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:42,929 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000083
2018-06-18 19:43:42,930 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000083 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:42,930 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000083
2018-06-18 19:43:42,930 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:42,936 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:43,956 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:44,957 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:45,150 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000083] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000083),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:45,960 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:46,171 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000083 buffer server: laptop-name:39855
2018-06-18 19:43:46,398 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000083),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:46,963 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:47,967 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:48,329 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 80
2018-06-18 19:43:48,330 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:48,967 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000083
2018-06-18 19:43:48,968 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000083
2018-06-18 19:43:48,969 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:48,971 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:49,974 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000083, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:49,974 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000083@localhost:40317
2018-06-18 19:43:49,974 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:50,011 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:51,012 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:43:51,012 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:43:51,013 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:52,019 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000084, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority82
2018-06-18 19:43:52,019 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000083
2018-06-18 19:43:52,021 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000084
2018-06-18 19:43:52,021 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:43:52,071 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:43:52,071 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000084
2018-06-18 19:43:52,072 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000084 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:43:52,072 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000084
2018-06-18 19:43:52,072 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:52,074 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:53,076 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:54,077 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:54,210 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000084] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000084),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:55,080 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:55,230 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000084 buffer server: laptop-name:43111
2018-06-18 19:43:55,452 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000084),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:43:56,083 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:57,086 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:43:57,390 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 81
2018-06-18 19:43:57,390 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:43:58,087 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000084
2018-06-18 19:43:58,087 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000084
2018-06-18 19:43:58,088 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:43:58,091 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:43:59,093 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000084, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:43:59,094 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000084@localhost:40317
2018-06-18 19:43:59,094 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:43:59,127 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:00,128 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:44:00,128 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:44:00,131 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:01,136 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000085, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority83
2018-06-18 19:44:01,136 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000084
2018-06-18 19:44:01,137 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000085
2018-06-18 19:44:01,137 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:44:01,174 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:44:01,174 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000085
2018-06-18 19:44:01,174 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000085 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:44:01,174 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000085
2018-06-18 19:44:01,174 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:44:01,179 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:02,180 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:03,182 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:03,308 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000085] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000085),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:04,185 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:04,327 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000085 buffer server: laptop-name:34579
2018-06-18 19:44:04,591 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000085),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:05,187 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:06,190 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:06,520 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 82
2018-06-18 19:44:06,521 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:44:07,191 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000085
2018-06-18 19:44:07,191 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000085
2018-06-18 19:44:07,193 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:44:07,194 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:08,197 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000085, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:44:08,197 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000085@localhost:40317
2018-06-18 19:44:08,197 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:44:08,235 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:09,235 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:44:09,235 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:44:09,238 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:10,244 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000086, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority84
2018-06-18 19:44:10,244 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000085
2018-06-18 19:44:10,246 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000086
2018-06-18 19:44:10,246 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:44:10,290 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:44:10,290 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000086
2018-06-18 19:44:10,290 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000086 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:44:10,290 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000086
2018-06-18 19:44:10,290 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:44:10,292 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:11,294 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:12,295 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:12,346 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000086] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000086),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:13,298 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:13,365 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000086 buffer server: laptop-name:37751
2018-06-18 19:44:13,619 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000086),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:14,301 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:15,304 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:44:15,542 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 83
2018-06-18 19:44:15,542 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:44:16,304 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000086
2018-06-18 19:44:16,305 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000086
2018-06-18 19:44:16,306 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:44:16,307 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:44:17,309 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000086, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:44:17,309 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000086@localhost:40317 2018-06-18 19:44:17,310 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:44:17,344 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:18,344 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:44:18,345 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:44:18,349 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:19,358 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000087, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority85 2018-06-18 19:44:19,358 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000086 2018-06-18 19:44:19,359 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000087 2018-06-18 19:44:19,360 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:44:19,406 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:44:19,407 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000087 2018-06-18 19:44:19,407 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000087 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:44:19,408 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000087 2018-06-18 19:44:19,408 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:44:19,409 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:20,425 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:21,426 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:21,531 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000087] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000087),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:22,429 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:22,550 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000087 buffer server: laptop-name:36277 2018-06-18 19:44:22,771 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000087),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:23,432 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:24,435 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:24,696 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 84 2018-06-18 19:44:24,697 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:44:25,436 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000087 2018-06-18 19:44:25,437 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000087 2018-06-18 19:44:25,438 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:44:25,440 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:26,444 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000087, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:44:26,444 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000087@localhost:40317 2018-06-18 19:44:26,445 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:44:26,479 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:27,480 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:44:27,480 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:44:27,484 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:28,490 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000088, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority86 2018-06-18 19:44:28,490 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000087 2018-06-18 19:44:28,492 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000088 2018-06-18 19:44:28,492 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:44:28,535 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:44:28,535 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000088 2018-06-18 19:44:28,535 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000088 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:44:28,535 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000088 2018-06-18 19:44:28,535 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:44:28,536 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:29,538 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:30,540 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:30,780 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000088] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000088),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:31,542 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:31,797 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000088 buffer server: laptop-name:45081 2018-06-18 19:44:32,100 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000088),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:32,543 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:33,546 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:34,013 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 85 2018-06-18 19:44:34,013 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:44:34,547 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000088 2018-06-18 19:44:34,547 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000088 2018-06-18 19:44:34,548 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:44:34,551 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:35,552 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000088, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:44:35,552 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000088@localhost:40317 2018-06-18 19:44:35,552 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:44:35,570 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:36,571 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:44:36,571 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:44:36,574 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:37,579 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000089, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority87 2018-06-18 19:44:37,580 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000088 2018-06-18 19:44:37,581 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000089 2018-06-18 19:44:37,581 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:44:37,625 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:44:37,626 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000089 2018-06-18 19:44:37,626 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000089 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:44:37,626 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000089 2018-06-18 19:44:37,626 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:44:37,629 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:38,631 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:39,634 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:39,714 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000089] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000089),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:40,636 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:40,733 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000089 buffer server: laptop-name:40401 2018-06-18 19:44:40,941 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000089),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:41,639 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:42,641 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:42,857 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 86 2018-06-18 19:44:42,857 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:44:43,642 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000089 2018-06-18 19:44:43,642 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000089 2018-06-18 19:44:43,645 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:44:43,646 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:44,648 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000089, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:44:44,648 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000089@localhost:40317 2018-06-18 19:44:44,648 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:44:44,679 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:45,679 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:44:45,679 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:44:45,682 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:44:46,689 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000090, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority88 2018-06-18 19:44:46,689 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000089 2018-06-18 19:44:46,690 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000090 2018-06-18 19:44:46,691 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:44:46,735 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:44:46,735 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000090 2018-06-18 19:44:46,735 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000090 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:44:46,735 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000090 2018-06-18 19:44:46,735 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:44:46,736 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:47,737 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:48,739 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:48,833 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000090] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000090),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:49,742 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:49,845 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000090 buffer server: laptop-name:44399 2018-06-18 19:44:50,062 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000090),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:44:50,744 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:51,747 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:44:51,991 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 87 2018-06-18 19:44:51,992 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:44:52,747 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000090 2018-06-18 19:44:52,748 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000090 2018-06-18 19:44:52,749 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:44:52,750 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:44:53,752 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000090, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:44:53,752 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000090@localhost:40317
2018-06-18 19:44:53,753 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:44:53,799 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:54,800 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:44:54,800 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:44:54,804 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:55,809 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000091, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority89
2018-06-18 19:44:55,809 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000090
2018-06-18 19:44:55,810 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000091
2018-06-18 19:44:55,810 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:44:55,854 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:44:55,854 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000091
2018-06-18 19:44:55,854 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000091 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:44:55,854 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000091
2018-06-18 19:44:55,855 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:44:55,856 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:56,858 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:57,859 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:58,011 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000091] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000091),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:58,862 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:44:59,032 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000091 buffer server: laptop-name:44421
2018-06-18 19:44:59,260 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000091),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:44:59,866 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:00,869 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:01,174 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 88
2018-06-18 19:45:01,174 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:01,869 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000091
2018-06-18 19:45:01,870 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000091
2018-06-18 19:45:01,871 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:01,873 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:02,876 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000091, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:45:02,876 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000091@localhost:40317
2018-06-18 19:45:02,876 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:45:02,920 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:03,921 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:45:03,921 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:45:03,925 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:04,931 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000092, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority90
2018-06-18 19:45:04,931 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000091
2018-06-18 19:45:04,934 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000092
2018-06-18 19:45:04,934 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:45:04,985 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:45:04,985 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000092
2018-06-18 19:45:04,985 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000092 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:45:04,985 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000092
2018-06-18 19:45:04,986 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:04,986 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:05,988 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:06,990 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:07,131 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000092] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000092),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:07,992 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:08,150 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000092 buffer server: laptop-name:36599
2018-06-18 19:45:08,409 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000092),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:08,995 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:09,998 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:10,323 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 89
2018-06-18 19:45:10,323 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:10,999 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000092
2018-06-18 19:45:10,999 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000092
2018-06-18 19:45:11,000 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:11,002 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:12,005 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000092, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:45:12,005 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000092@localhost:40317
2018-06-18 19:45:12,005 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:45:12,042 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:13,043 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:45:13,043 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:45:13,046 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:14,051 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000093, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority91
2018-06-18 19:45:14,052 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000092
2018-06-18 19:45:14,053 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000093
2018-06-18 19:45:14,053 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:45:14,099 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:45:14,099 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000093
2018-06-18 19:45:14,099 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000093 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:45:14,099 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000093
2018-06-18 19:45:14,099 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:14,100 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:15,106 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:16,107 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:16,239 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000093] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000093),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:17,110 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:17,260 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000093 buffer server: laptop-name:33513
2018-06-18 19:45:17,525 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000093),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:18,113 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:19,116 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:19,464 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 90
2018-06-18 19:45:19,465 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:20,117 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000093
2018-06-18 19:45:20,117 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000093
2018-06-18 19:45:20,118 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:20,119 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:21,121 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000093, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:45:21,121 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000093@localhost:40317
2018-06-18 19:45:21,121 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:45:21,138 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:22,139 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:45:22,139 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:45:22,143 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:23,148 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000094, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority92
2018-06-18 19:45:23,148 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000093
2018-06-18 19:45:23,149 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000094
2018-06-18 19:45:23,149 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:45:23,196 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:45:23,196 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000094
2018-06-18 19:45:23,196 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000094 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:45:23,197 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000094
2018-06-18 19:45:23,197 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:23,198 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:24,200 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:25,202 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:25,330 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000094] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000094),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:26,204 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:26,349 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000094 buffer server: laptop-name:38047
2018-06-18 19:45:26,569 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000094),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:27,206 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:28,209 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:28,483 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 91
2018-06-18 19:45:28,484 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:29,210 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000094
2018-06-18 19:45:29,210 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000094
2018-06-18 19:45:29,212 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:29,213 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:30,215 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000094, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:45:30,215 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000094@localhost:40317
2018-06-18 19:45:30,216 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:45:30,259 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:31,259 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:45:31,260 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:45:31,262 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:32,267 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000095, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority93
2018-06-18 19:45:32,268 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000094
2018-06-18 19:45:32,269 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000095
2018-06-18 19:45:32,269 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:45:32,310 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:45:32,310 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000095
2018-06-18 19:45:32,310 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000095 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:45:32,310 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000095
2018-06-18 19:45:32,310 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:32,313 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:33,316 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:34,318 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:34,544 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000095] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000095),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:35,321 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:35,563 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000095 buffer server: laptop-name:35649
2018-06-18 19:45:35,809 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000095),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:36,324 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:37,326 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:37,717 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 92
2018-06-18 19:45:37,717 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:38,327 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000095
2018-06-18 19:45:38,327 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000095
2018-06-18 19:45:38,329 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:38,330 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:39,333 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000095, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:45:39,333 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000095@localhost:40317
2018-06-18 19:45:39,333 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:45:39,369 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:40,369 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:45:40,369 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:45:40,373 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:41,381 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000096, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority94
2018-06-18 19:45:41,381 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000095
2018-06-18 19:45:41,383 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000096
2018-06-18 19:45:41,383 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:45:41,430 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:45:41,430 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000096
2018-06-18 19:45:41,430 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000096 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:45:41,431 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000096
2018-06-18 19:45:41,431 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:41,432 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:42,434 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:43,436 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:43,740 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000096] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000096),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:44,439 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:44,758 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000096 buffer server: laptop-name:44345
2018-06-18 19:45:44,998 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
 context: PTContainer[id=2(container_1529349239295_0005_01_000096),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:45:45,442 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:46,444 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:45:46,923 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 93
2018-06-18 19:45:46,924 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:45:47,444 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000096
2018-06-18 19:45:47,444 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000096
2018-06-18 19:45:47,445 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:45:47,446 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:45:48,447 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000096, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:45:48,447 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000096@localhost:40317 2018-06-18 19:45:48,447 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:45:48,471 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:49,471 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:45:49,471 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:45:49,472 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:45:50,477 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000097, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority95 2018-06-18 19:45:50,477 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000096 2018-06-18 19:45:50,478 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000097 2018-06-18 19:45:50,478 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:45:50,519 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:45:50,519 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000097 2018-06-18 19:45:50,520 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000097 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:45:50,520 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000097 2018-06-18 19:45:50,520 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:45:50,523 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:51,525 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:52,526 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:52,700 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000097] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000097),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:45:53,529 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:53,713 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000097 buffer server: laptop-name:38663 2018-06-18 19:45:53,936 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000097),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:45:54,532 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:55,534 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:55,861 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 94 2018-06-18 19:45:55,861 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:45:56,534 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000097 2018-06-18 19:45:56,535 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000097 2018-06-18 19:45:56,536 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:45:56,537 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:45:57,540 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000097, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:45:57,540 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000097@localhost:40317 2018-06-18 19:45:57,541 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:45:57,577 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:45:58,578 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:45:58,578 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:45:58,581 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:45:59,586 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000098, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority96 2018-06-18 19:45:59,587 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000097 2018-06-18 19:45:59,588 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000098 2018-06-18 19:45:59,588 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:45:59,635 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:45:59,635 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000098 2018-06-18 19:45:59,635 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000098 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:45:59,635 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000098 2018-06-18 19:45:59,636 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:45:59,638 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:00,642 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:01,643 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:01,770 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000098] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000098),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:02,646 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:02,789 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000098 buffer server: laptop-name:33849 2018-06-18 19:46:03,038 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000098),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:03,649 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:04,652 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:04,956 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 95 2018-06-18 19:46:04,957 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:46:05,653 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000098 2018-06-18 19:46:05,653 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000098 2018-06-18 19:46:05,654 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:46:05,656 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:46:06,659 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000098, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:46:06,659 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000098@localhost:40317 2018-06-18 19:46:06,659 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:46:06,696 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:07,696 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:46:07,697 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:46:07,700 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:46:08,706 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000099, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority97 2018-06-18 19:46:08,706 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000098 2018-06-18 19:46:08,707 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000099 2018-06-18 19:46:08,707 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:46:08,746 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:46:08,747 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000099 2018-06-18 19:46:08,747 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000099 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:46:08,747 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000099 2018-06-18 19:46:08,747 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:46:08,747 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:09,749 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:10,751 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:10,833 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000099] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000099),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:11,754 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:11,853 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000099 buffer server: laptop-name:38991 2018-06-18 19:46:12,117 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000099),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:12,757 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:13,760 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:14,043 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 96 2018-06-18 19:46:14,044 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:46:14,760 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000099 2018-06-18 19:46:14,760 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000099 2018-06-18 19:46:14,762 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:46:14,765 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:46:15,767 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000099, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:46:15,768 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000099@localhost:40317 2018-06-18 19:46:15,768 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:46:15,804 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:16,805 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:46:16,805 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:46:16,808 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:46:17,813 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000100, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority98 2018-06-18 19:46:17,813 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000099 2018-06-18 19:46:17,814 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000100 2018-06-18 19:46:17,814 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:46:17,860 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:46:17,860 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000100 2018-06-18 19:46:17,860 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000100 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:46:17,860 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000100 2018-06-18 19:46:17,861 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:46:17,862 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:18,865 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:19,866 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:20,148 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000100] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000100),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:20,869 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:21,169 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000100 buffer server: laptop-name:37933 2018-06-18 19:46:21,422 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000100),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:46:21,870 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:22,873 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:46:23,342 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 97 2018-06-18 19:46:23,343 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:46:23,874 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000100 2018-06-18 19:46:23,874 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000100 2018-06-18 19:46:23,876 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:46:23,877 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:46:24,880 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000100, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:46:24,880 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000100@localhost:40317
2018-06-18 19:46:24,880 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:46:24,915 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:25,915 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:46:25,915 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:46:25,919 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:26,925 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000101, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority99
2018-06-18 19:46:26,925 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000100
2018-06-18 19:46:26,926 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000101
2018-06-18 19:46:26,926 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:46:26,937 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:46:26,937 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000101
2018-06-18 19:46:26,937 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000101 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:46:26,937 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000101
2018-06-18 19:46:26,937 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:26,937 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:27,940 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:28,941 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:29,162 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000101] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000101),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:29,945 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:30,192 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000101 buffer server: laptop-name:38305
2018-06-18 19:46:30,484 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000101),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:30,946 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:31,948 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:32,409 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 98
2018-06-18 19:46:32,410 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:46:32,949 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000101
2018-06-18 19:46:32,949 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000101
2018-06-18 19:46:32,950 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:32,953 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:33,955 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000101, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:46:33,955 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000101@localhost:40317
2018-06-18 19:46:33,956 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:46:33,997 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:34,997 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:46:34,997 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:46:34,998 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:36,003 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000102, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority100
2018-06-18 19:46:36,003 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000101
2018-06-18 19:46:36,004 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000102
2018-06-18 19:46:36,004 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:46:36,049 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:46:36,049 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000102
2018-06-18 19:46:36,049 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000102 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:46:36,050 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000102
2018-06-18 19:46:36,050 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:36,053 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:37,055 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:38,057 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:38,436 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000102] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000102),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:39,060 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:39,456 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000102 buffer server: laptop-name:35187
2018-06-18 19:46:39,721 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000102),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:40,062 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:41,065 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:41,635 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 99
2018-06-18 19:46:41,635 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:46:42,066 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000102
2018-06-18 19:46:42,066 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000102
2018-06-18 19:46:42,068 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:42,069 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:43,070 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000102, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:46:43,070 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000102@localhost:40317
2018-06-18 19:46:43,071 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:46:43,092 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:44,092 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:46:44,092 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:46:44,096 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:45,101 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000103, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority101
2018-06-18 19:46:45,101 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000102
2018-06-18 19:46:45,103 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000103
2018-06-18 19:46:45,103 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:46:45,143 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:46:45,143 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000103
2018-06-18 19:46:45,144 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000103 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:46:45,144 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000103
2018-06-18 19:46:45,144 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:45,146 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:46,148 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:47,150 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:47,325 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000103] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000103),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:48,152 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:48,344 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000103 buffer server: laptop-name:36409
2018-06-18 19:46:48,594 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000103),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:49,154 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:50,155 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:50,519 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 100
2018-06-18 19:46:50,520 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:46:51,155 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000103
2018-06-18 19:46:51,155 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000103
2018-06-18 19:46:51,156 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:51,157 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:52,158 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000103, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:46:52,158 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000103@localhost:40317
2018-06-18 19:46:52,158 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:46:52,177 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:53,177 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:46:53,177 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:46:53,179 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:54,183 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000104, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority102
2018-06-18 19:46:54,184 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000103
2018-06-18 19:46:54,185 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000104
2018-06-18 19:46:54,185 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:46:54,228 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:46:54,228 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000104
2018-06-18 19:46:54,228 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000104 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:46:54,229 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000104
2018-06-18 19:46:54,229 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:46:54,230 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:55,232 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:56,234 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:56,332 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000104] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000104),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:57,237 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:57,351 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000104 buffer server: laptop-name:35537
2018-06-18 19:46:57,614 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000104),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:46:58,240 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:59,243 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:46:59,542 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 101
2018-06-18 19:46:59,543 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:00,244 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000104
2018-06-18 19:47:00,244 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000104
2018-06-18 19:47:00,246 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:00,248 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:01,250 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000104, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:01,250 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000104@localhost:40317
2018-06-18 19:47:01,250 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:01,283 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:02,283 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:02,283 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:02,287 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:03,291 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000105, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority103
2018-06-18 19:47:03,291 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000104
2018-06-18 19:47:03,292 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000105
2018-06-18 19:47:03,293 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:03,332 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:03,333 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000105
2018-06-18 19:47:03,333 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000105 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:03,333 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000105
2018-06-18 19:47:03,333 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:03,334 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:04,336 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:05,338 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:05,538 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000105] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000105),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:06,340 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:06,557 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000105 buffer server: laptop-name:44623
2018-06-18 19:47:06,854 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000105),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:07,342 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:08,345 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:08,765 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 102
2018-06-18 19:47:08,766 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:09,345 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000105
2018-06-18 19:47:09,346 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000105
2018-06-18 19:47:09,347 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:09,349 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:47:10,351 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000105, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:47:10,351 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000105@localhost:40317 2018-06-18 19:47:10,351 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:47:10,385 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:47:11,386 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:47:11,386 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:47:11,389 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:47:12,394 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000106, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority104
2018-06-18 19:47:12,394 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000105
2018-06-18 19:47:12,396 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000106
2018-06-18 19:47:12,396 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:12,439 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:12,439 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000106
2018-06-18 19:47:12,439 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000106 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:12,440 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000106
2018-06-18 19:47:12,440 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:12,441 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:13,443 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:14,445 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:14,540 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000106] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000106),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:15,448 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:15,559 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000106 buffer server: laptop-name:39641
2018-06-18 19:47:15,788 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000106),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:16,449 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:17,452 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:17,711 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 103
2018-06-18 19:47:17,712 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:18,453 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000106
2018-06-18 19:47:18,453 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000106
2018-06-18 19:47:18,454 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:18,457 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:19,460 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000106, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:19,460 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000106@localhost:40317
2018-06-18 19:47:19,460 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:19,498 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:20,499 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:20,499 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:20,503 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:21,508 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000107, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority105
2018-06-18 19:47:21,508 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000106
2018-06-18 19:47:21,509 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000107
2018-06-18 19:47:21,510 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:21,563 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:21,564 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000107
2018-06-18 19:47:21,564 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000107 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:21,564 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000107
2018-06-18 19:47:21,564 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:21,566 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:22,568 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:23,570 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:24,138 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000107] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000107),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:24,571 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:25,150 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000107 buffer server: laptop-name:41371
2018-06-18 19:47:25,571 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000107),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:25,573 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:26,576 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:27,440 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 104
2018-06-18 19:47:27,440 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:27,576 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000107
2018-06-18 19:47:27,576 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000107
2018-06-18 19:47:27,578 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:27,581 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:28,583 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000107, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:28,584 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000107@localhost:40317
2018-06-18 19:47:28,584 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:28,625 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:29,625 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:29,625 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:29,629 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:30,634 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000108, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority106
2018-06-18 19:47:30,634 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000107
2018-06-18 19:47:30,636 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000108
2018-06-18 19:47:30,636 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:30,679 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:30,679 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000108
2018-06-18 19:47:30,680 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000108 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:30,680 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000108
2018-06-18 19:47:30,680 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:30,682 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:31,690 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:32,692 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:33,693 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:34,119 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000108] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000108),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:34,696 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:35,137 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000108 buffer server: laptop-name:46427
2018-06-18 19:47:35,421 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000108),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:35,697 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:36,700 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:37,332 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 105
2018-06-18 19:47:37,333 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:37,701 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000108
2018-06-18 19:47:37,701 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000108
2018-06-18 19:47:37,702 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:37,703 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:38,705 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000108, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:38,706 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000108@localhost:40317
2018-06-18 19:47:38,706 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:38,738 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:39,739 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:39,739 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:39,744 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:40,753 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000109, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority107
2018-06-18 19:47:40,753 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000108
2018-06-18 19:47:40,755 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000109
2018-06-18 19:47:40,755 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:40,826 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:40,826 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000109
2018-06-18 19:47:40,826 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000109 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:40,827 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000109
2018-06-18 19:47:40,827 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:40,828 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:41,830 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:42,831 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:43,833 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:43,972 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000109] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000109),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:44,836 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:45,001 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000109 buffer server: laptop-name:41459
2018-06-18 19:47:45,408 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000109),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:45,839 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:46,841 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:47,218 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 106
2018-06-18 19:47:47,219 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:47,841 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000109
2018-06-18 19:47:47,841 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000109
2018-06-18 19:47:47,843 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:47,844 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:48,847 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000109, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:48,847 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000109@localhost:40317
2018-06-18 19:47:48,848 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:48,913 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:49,913 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:49,913 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:49,915 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:50,918 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000110, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority108
2018-06-18 19:47:50,918 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000109
2018-06-18 19:47:50,919 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000110
2018-06-18 19:47:50,919 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:47:50,940 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:47:50,940 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000110
2018-06-18 19:47:50,940 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000110 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:47:50,940 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000110
2018-06-18 19:47:50,941 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:50,941 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:51,943 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:52,944 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:53,649 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000110] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000110),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:53,948 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:54,678 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000110 buffer server: laptop-name:39119
2018-06-18 19:47:54,942 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000110),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:47:54,950 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:55,951 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:56,861 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 107
2018-06-18 19:47:56,862 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:47:56,951 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000110
2018-06-18 19:47:56,952 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000110
2018-06-18 19:47:56,953 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:56,953 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:47:57,955 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000110, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:47:57,955 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000110@localhost:40317
2018-06-18 19:47:57,955 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:47:57,990 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:47:58,990 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:47:58,991 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:47:58,994 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:00,000 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000111, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority109
2018-06-18 19:48:00,000 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000110
2018-06-18 19:48:00,001 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000111
2018-06-18 19:48:00,001 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:00,047 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:00,048 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000111
2018-06-18 19:48:00,048 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000111 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:00,048 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000111
2018-06-18 19:48:00,048 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:00,049 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:01,052 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:02,053 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:02,188 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000111] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000111),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:03,056 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:03,207 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000111 buffer server: laptop-name:43497
2018-06-18 19:48:03,501 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000111),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:48:04,058 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:05,061 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:05,429 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 108 2018-06-18 19:48:05,430 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:48:06,061 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000111 2018-06-18 19:48:06,061 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000111 2018-06-18 19:48:06,063 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:48:06,065 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:48:07,067 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000111, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:48:07,067 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000111@localhost:40317
2018-06-18 19:48:07,068 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:48:07,123 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:08,123 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:48:08,124 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:48:08,129 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:09,131 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000112, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority110
2018-06-18 19:48:09,131 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000111
2018-06-18 19:48:09,131 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000112
2018-06-18 19:48:09,132 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:09,173 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:09,173 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000112
2018-06-18 19:48:09,174 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000112 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:09,174 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000112
2018-06-18 19:48:09,174 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:09,186 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:10,197 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:11,198 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:11,434 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000112] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000112),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:12,201 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:12,453 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000112 buffer server: laptop-name:35453
2018-06-18 19:48:12,748 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000112),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:13,204 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:14,207 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:14,666 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 109
2018-06-18 19:48:14,667 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:48:15,208 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000112
2018-06-18 19:48:15,209 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000112
2018-06-18 19:48:15,210 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:15,213 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:16,215 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000112, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:48:16,215 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000112@localhost:40317
2018-06-18 19:48:16,215 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:48:16,250 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:17,250 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:48:17,250 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:48:17,254 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:18,259 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000113, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority111
2018-06-18 19:48:18,259 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000112
2018-06-18 19:48:18,260 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000113
2018-06-18 19:48:18,260 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:18,300 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:18,300 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000113
2018-06-18 19:48:18,300 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000113 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:18,301 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000113
2018-06-18 19:48:18,301 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:18,302 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:19,304 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:20,305 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:20,362 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000113] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000113),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:21,308 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:21,381 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000113 buffer server: laptop-name:45449
2018-06-18 19:48:21,622 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000113),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:22,309 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:23,312 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:23,550 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 110
2018-06-18 19:48:23,550 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:48:24,313 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000113
2018-06-18 19:48:24,313 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000113
2018-06-18 19:48:24,314 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:24,315 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:25,317 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000113, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:48:25,317 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000113@localhost:40317
2018-06-18 19:48:25,317 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:48:25,351 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:26,352 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:48:26,352 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:48:26,355 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:27,360 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000114, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority112
2018-06-18 19:48:27,360 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000113
2018-06-18 19:48:27,361 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000114
2018-06-18 19:48:27,362 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:27,410 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:27,410 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000114
2018-06-18 19:48:27,411 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000114 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:27,411 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000114
2018-06-18 19:48:27,411 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:27,413 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:28,414 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:29,415 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:29,483 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000114] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000114),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:30,418 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:30,501 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000114 buffer server: laptop-name:35137
2018-06-18 19:48:30,748 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000114),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:31,419 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:32,422 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:32,671 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 111
2018-06-18 19:48:32,672 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:48:33,423 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000114
2018-06-18 19:48:33,423 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000114
2018-06-18 19:48:33,425 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:33,426 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:34,428 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000114, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:48:34,428 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000114@localhost:40317
2018-06-18 19:48:34,429 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:48:34,464 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:35,464 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:48:35,464 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:48:35,468 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:36,473 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000115, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority113
2018-06-18 19:48:36,473 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000114
2018-06-18 19:48:36,475 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000115
2018-06-18 19:48:36,475 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:36,508 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:36,508 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000115
2018-06-18 19:48:36,508 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000115 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:36,508 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000115
2018-06-18 19:48:36,509 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:36,511 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:37,513 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:38,515 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:38,573 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000115] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000115),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:39,517 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:39,593 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000115 buffer server: laptop-name:46411
2018-06-18 19:48:39,893 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000115),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:40,520 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:41,523 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:41,796 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 112
2018-06-18 19:48:41,797 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:48:42,524 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000115
2018-06-18 19:48:42,524 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000115
2018-06-18 19:48:42,525 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:42,527 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:43,529 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000115, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:48:43,529 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000115@localhost:40317
2018-06-18 19:48:43,529 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:48:43,563 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:44,563 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:48:44,564 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:48:44,567 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:45,572 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000116, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority114
2018-06-18 19:48:45,572 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000115
2018-06-18 19:48:45,573 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000116
2018-06-18 19:48:45,573 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:48:45,614 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:48:45,614 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000116
2018-06-18 19:48:45,614 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000116 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:48:45,614 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000116
2018-06-18 19:48:45,615 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:45,616 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:46,619 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:47,620 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:47,658 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000116] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000116),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:48,623 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:48,677 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000116 buffer server: laptop-name:37021
2018-06-18 19:48:48,909 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000116),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:48:49,626 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:50,630 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:48:50,820 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 113
2018-06-18 19:48:50,822 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:48:51,630 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000116
2018-06-18 19:48:51,631 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000116
2018-06-18 19:48:51,631 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:48:51,632 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map 2018-06-18 19:48:52,634 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000116, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:48:52,634 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000116@localhost:40317 2018-06-18 19:48:52,634 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:48:52,683 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:53,684 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:48:53,684 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:48:53,687 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:48:54,690 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000117, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority115 2018-06-18 19:48:54,690 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000116 2018-06-18 19:48:54,691 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000117 2018-06-18 19:48:54,692 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:48:54,718 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:48:54,718 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000117 2018-06-18 19:48:54,718 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000117 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:48:54,718 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000117 2018-06-18 19:48:54,718 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:48:54,719 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:55,722 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:56,723 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:57,687 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000117] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000117),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:48:57,724 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:58,699 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000117 buffer server: laptop-name:46405 2018-06-18 19:48:58,726 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:48:59,046 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000117),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:48:59,727 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:00,731 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:00,939 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 114 2018-06-18 19:49:00,939 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:49:01,732 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000117 2018-06-18 19:49:01,732 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000117 2018-06-18 19:49:01,733 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:49:01,735 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:02,737 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000117, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:49:02,737 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000117@localhost:40317 2018-06-18 19:49:02,737 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:49:02,774 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:03,774 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:49:03,774 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:49:03,777 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:04,782 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000118, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority116 2018-06-18 19:49:04,783 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000117 2018-06-18 19:49:04,784 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000118 2018-06-18 19:49:04,784 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:49:04,824 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:49:04,824 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000118 2018-06-18 19:49:04,824 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000118 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:49:04,825 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000118 2018-06-18 19:49:04,825 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:49:04,826 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:05,828 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:06,829 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:07,165 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000118] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000118),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:07,832 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:08,185 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000118 buffer server: laptop-name:46121 2018-06-18 19:49:08,471 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000118),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:08,834 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:09,838 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:10,385 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 115 2018-06-18 19:49:10,385 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:49:10,838 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000118 2018-06-18 19:49:10,838 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000118 2018-06-18 19:49:10,840 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:49:10,841 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:11,842 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000118, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:49:11,842 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000118@localhost:40317 2018-06-18 19:49:11,842 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:49:11,860 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:12,861 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:49:12,861 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:49:12,862 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:13,866 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000119, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority117 2018-06-18 19:49:13,867 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000118 2018-06-18 19:49:13,868 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000119 2018-06-18 19:49:13,869 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:49:13,923 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:49:13,923 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000119 2018-06-18 19:49:13,923 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000119 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:49:13,924 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000119 2018-06-18 19:49:13,924 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:49:13,927 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:14,932 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:15,934 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:16,333 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000119] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000119),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:16,937 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:17,347 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000119 buffer server: laptop-name:45669 2018-06-18 19:49:17,777 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000119),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:17,948 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:18,952 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:19,633 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 116 2018-06-18 19:49:19,633 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:49:19,952 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000119 2018-06-18 19:49:19,952 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000119 2018-06-18 19:49:19,953 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:49:19,954 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:20,955 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000119, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:49:20,955 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000119@localhost:40317 2018-06-18 19:49:20,955 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:49:20,983 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:21,983 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:49:21,983 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:49:21,986 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:22,992 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000120, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority118 2018-06-18 19:49:22,992 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000119 2018-06-18 19:49:22,993 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000120 2018-06-18 19:49:22,993 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:49:23,040 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:49:23,040 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000120 2018-06-18 19:49:23,040 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000120 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:49:23,041 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000120 2018-06-18 19:49:23,041 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:49:23,042 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:24,043 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:25,045 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:25,305 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000120] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000120),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:26,048 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:26,324 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000120 buffer server: laptop-name:37023 2018-06-18 19:49:26,542 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000120),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:49:27,051 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:28,054 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:28,468 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 117 2018-06-18 19:49:28,469 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:49:29,055 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000120 2018-06-18 19:49:29,055 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000120 2018-06-18 19:49:29,056 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:49:29,057 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:49:30,059 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000120, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:49:30,059 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000120@localhost:40317 2018-06-18 19:49:30,060 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:49:30,113 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:49:31,114 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:49:31,114 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:49:31,117 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:49:32,122 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000121, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority119
2018-06-18 19:49:32,122 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000120
2018-06-18 19:49:32,123 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000121
2018-06-18 19:49:32,123 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:49:32,164 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:49:32,164 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000121
2018-06-18 19:49:32,164 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000121 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:49:32,165 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000121
2018-06-18 19:49:32,165 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:32,166 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:33,174 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:34,175 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:34,293 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000121] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000121),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:35,178 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:35,314 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000121 buffer server: laptop-name:39989
2018-06-18 19:49:35,567 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000121),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:36,180 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:37,182 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:37,486 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 118
2018-06-18 19:49:37,487 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:49:38,183 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000121
2018-06-18 19:49:38,183 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000121
2018-06-18 19:49:38,185 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:38,187 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:39,189 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000121, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:49:39,189 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000121@localhost:40317
2018-06-18 19:49:39,190 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:49:39,231 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:40,231 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:49:40,232 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:49:40,235 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:41,237 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000122, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority120
2018-06-18 19:49:41,237 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000121
2018-06-18 19:49:41,238 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000122
2018-06-18 19:49:41,238 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:49:41,257 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:49:41,257 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000122
2018-06-18 19:49:41,257 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000122 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:49:41,258 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000122
2018-06-18 19:49:41,258 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:41,258 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:42,260 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:43,262 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:43,512 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000122] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000122),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:44,265 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:44,524 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000122 buffer server: laptop-name:35815
2018-06-18 19:49:44,905 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000122),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:45,268 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:46,270 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:46,816 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 119
2018-06-18 19:49:46,817 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:49:47,271 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000122
2018-06-18 19:49:47,271 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000122
2018-06-18 19:49:47,272 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:47,274 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:48,276 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000122, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:49:48,276 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000122@localhost:40317
2018-06-18 19:49:48,276 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:49:48,308 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:49,308 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:49:49,308 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:49:49,310 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:50,315 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000123, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority121
2018-06-18 19:49:50,315 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000122
2018-06-18 19:49:50,316 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000123
2018-06-18 19:49:50,316 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:49:50,359 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:49:50,359 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000123
2018-06-18 19:49:50,359 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000123 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:49:50,359 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000123
2018-06-18 19:49:50,360 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:50,361 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:51,364 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:52,368 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:52,447 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000123] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000123),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:53,371 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:53,467 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000123 buffer server: laptop-name:45675
2018-06-18 19:49:53,736 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000123),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:49:54,375 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:55,377 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:55,647 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 120
2018-06-18 19:49:55,647 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:49:56,377 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000123
2018-06-18 19:49:56,377 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000123
2018-06-18 19:49:56,378 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:56,379 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:57,381 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000123, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:49:57,381 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000123@localhost:40317
2018-06-18 19:49:57,382 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:49:57,417 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:58,418 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:49:58,418 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:49:58,421 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:49:59,425 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000124, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority122
2018-06-18 19:49:59,425 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000123
2018-06-18 19:49:59,426 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000124
2018-06-18 19:49:59,426 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:49:59,477 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:49:59,478 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000124
2018-06-18 19:49:59,478 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000124 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:49:59,478 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000124
2018-06-18 19:49:59,479 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:49:59,489 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:00,492 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:01,494 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:01,868 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000124] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000124),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:50:02,497 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:02,888 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000124 buffer server: laptop-name:43191
2018-06-18 19:50:03,177 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000124),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:50:03,499 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:04,501 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:05,103 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 121
2018-06-18 19:50:05,104 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:50:05,502 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000124
2018-06-18 19:50:05,502 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000124
2018-06-18 19:50:05,503 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:50:05,510 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:06,512 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000124, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:50:06,513 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000124@localhost:40317
2018-06-18 19:50:06,513 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:50:06,548 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:07,549 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:50:07,549 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:50:07,553 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:08,558 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000125, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority123
2018-06-18 19:50:08,558 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000124
2018-06-18 19:50:08,559 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000125
2018-06-18 19:50:08,560 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:50:08,604 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:50:08,605 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000125
2018-06-18 19:50:08,605 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000125 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:50:08,605 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000125
2018-06-18 19:50:08,605 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:50:08,607 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:09,611 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:10,612 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:10,802 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000125] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000125),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:50:11,614 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:11,822 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000125 buffer server: laptop-name:33223
2018-06-18 19:50:12,101 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000125),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:50:12,616 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:13,619 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:14,012 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 122
2018-06-18 19:50:14,012 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:50:14,620 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000125
2018-06-18 19:50:14,620 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000125
2018-06-18 19:50:14,622 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:50:14,624 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:15,626 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000125, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:50:15,626 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000125@localhost:40317
2018-06-18 19:50:15,626 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:50:15,677 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:16,677 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:50:16,677 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:50:16,682 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:17,686 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000126, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority124
2018-06-18 19:50:17,686 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000125
2018-06-18 19:50:17,687 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000126
2018-06-18 19:50:17,688 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:50:17,724 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:50:17,725 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000126
2018-06-18 19:50:17,725 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000126 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:50:17,725 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000126
2018-06-18 19:50:17,725 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:50:17,729 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:18,731 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:19,733 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:19,865 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000126] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000126),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:50:20,735 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:50:20,884 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000126 buffer server: laptop-name:42919
2018-06-18 19:50:21,153 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000126),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:21,738 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:22,742 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:23,078 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 123 2018-06-18 19:50:23,078 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:50:23,742 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000126 2018-06-18 19:50:23,743 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000126 2018-06-18 19:50:23,745 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:50:23,747 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:24,749 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000126, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:50:24,749 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000126@localhost:40317 2018-06-18 19:50:24,749 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:50:24,783 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:25,784 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:50:25,784 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:50:25,787 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:26,792 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000127, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority125 2018-06-18 19:50:26,793 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000126 2018-06-18 19:50:26,794 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000127 2018-06-18 19:50:26,794 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:50:26,837 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:50:26,838 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000127 2018-06-18 19:50:26,838 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000127 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:50:26,838 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000127 2018-06-18 19:50:26,839 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:50:26,840 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:27,842 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:28,844 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:28,955 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000127] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000127),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:29,847 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:29,975 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000127 buffer server: laptop-name:33551 2018-06-18 19:50:30,195 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000127),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:30,850 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:31,853 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:32,119 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 124 2018-06-18 19:50:32,120 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:50:32,854 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000127 2018-06-18 19:50:32,854 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000127 2018-06-18 19:50:32,856 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:50:32,857 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:33,860 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000127, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:50:33,860 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000127@localhost:40317 2018-06-18 19:50:33,860 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:50:33,897 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:34,898 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:50:34,898 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:50:34,901 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:35,907 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000128, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority126 2018-06-18 19:50:35,907 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000127 2018-06-18 19:50:35,909 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000128 2018-06-18 19:50:35,909 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:50:35,954 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:50:35,954 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000128 2018-06-18 19:50:35,954 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000128 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:50:35,954 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000128 2018-06-18 19:50:35,954 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:50:35,955 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:36,959 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:37,960 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:38,029 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000128] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000128),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:38,963 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:39,048 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000128 buffer server: laptop-name:40053 2018-06-18 19:50:39,307 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000128),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:39,965 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:40,968 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:41,219 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 125 2018-06-18 19:50:41,220 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:50:41,968 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000128 2018-06-18 19:50:41,968 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000128 2018-06-18 19:50:41,970 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:50:41,973 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:42,976 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000128, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:50:42,976 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000128@localhost:40317 2018-06-18 19:50:42,976 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:50:43,011 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:44,012 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:50:44,012 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:50:44,016 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:45,021 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000129, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority127 2018-06-18 19:50:45,022 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000128 2018-06-18 19:50:45,023 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000129 2018-06-18 19:50:45,023 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:50:45,063 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:50:45,064 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000129 2018-06-18 19:50:45,064 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000129 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:50:45,064 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000129 2018-06-18 19:50:45,064 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:50:45,066 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:46,068 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:47,070 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:47,072 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000129] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000129),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:48,073 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:48,091 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000129 buffer server: laptop-name:35615 2018-06-18 19:50:48,338 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000129),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:49,075 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:50,077 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:50,267 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 126 2018-06-18 19:50:50,268 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:50:51,078 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000129 2018-06-18 19:50:51,078 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000129 2018-06-18 19:50:51,079 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:50:51,081 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:52,084 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000129, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:50:52,084 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000129@localhost:40317 2018-06-18 19:50:52,084 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:50:52,117 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:53,118 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:50:53,118 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:50:53,122 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:50:54,128 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000130, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority128 2018-06-18 19:50:54,128 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000129 2018-06-18 19:50:54,130 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000130 2018-06-18 19:50:54,130 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:50:54,170 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:50:54,170 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000130 2018-06-18 19:50:54,170 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000130 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:50:54,171 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000130 2018-06-18 19:50:54,171 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317 2018-06-18 19:50:54,172 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:55,174 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:56,175 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:56,227 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000130] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000130),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:57,178 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:57,253 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000130 buffer server: laptop-name:43347 2018-06-18 19:50:57,527 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000130),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:50:58,181 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:59,184 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:50:59,444 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 127 2018-06-18 19:50:59,445 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:51:00,185 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000130 2018-06-18 19:51:00,185 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000130 2018-06-18 19:51:00,187 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:51:00,189 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:51:01,192 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000130, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster. Container killed on request. Exit code is 143 Container exited with a non-zero exit code 143 2018-06-18 19:51:01,192 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000130@localhost:40317 2018-06-18 19:51:01,192 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]] 2018-06-18 19:51:01,229 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:51:02,229 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY] 2018-06-18 19:51:02,230 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null 2018-06-18 19:51:02,234 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map 2018-06-18 19:51:03,238 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000131, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority129 2018-06-18 19:51:03,238 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000130 2018-06-18 19:51:03,240 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000131 2018-06-18 19:51:03,240 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:. 2018-06-18 19:51:03,286 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null} 2018-06-18 19:51:03,286 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000131 2018-06-18 19:51:03,286 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000131 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr 2018-06-18 19:51:03,286 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000131 2018-06-18 19:51:03,286 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : 
localhost:40317
2018-06-18 19:51:03,287 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:04,289 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:05,291 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:05,504 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000131] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000131),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:06,294 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:06,523 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000131 buffer server: laptop-name:38383
2018-06-18 19:51:06,804 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000131),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:07,297 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:08,299 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:08,722 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 128
2018-06-18 19:51:08,723 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:09,300 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000131
2018-06-18 19:51:09,300 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000131
2018-06-18 19:51:09,302 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:09,303 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:10,305 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000131, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:10,305 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000131@localhost:40317
2018-06-18 19:51:10,306 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:10,339 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:11,339 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:11,339 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:11,342 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:12,348 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000132, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority130
2018-06-18 19:51:12,348 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000131
2018-06-18 19:51:12,350 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000132
2018-06-18 19:51:12,350 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:12,391 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:12,391 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000132
2018-06-18 19:51:12,391 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000132 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:51:12,391 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000132
2018-06-18 19:51:12,391 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy :
localhost:40317
2018-06-18 19:51:12,392 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:13,394 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:14,395 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:14,491 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000132] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000132),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:15,398 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:15,510 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000132 buffer server: laptop-name:41935
2018-06-18 19:51:15,734 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000132),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:16,401 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:17,404 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:17,674 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 129
2018-06-18 19:51:17,674 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:18,405 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000132
2018-06-18 19:51:18,405 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000132
2018-06-18 19:51:18,406 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:18,408 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:19,411 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000132, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:19,411 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000132@localhost:40317
2018-06-18 19:51:19,411 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:19,446 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:20,447 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:20,447 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:20,450 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:21,455 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000133, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority131
2018-06-18 19:51:21,456 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000132
2018-06-18 19:51:21,457 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000133
2018-06-18 19:51:21,457 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:21,500 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:21,500 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000133
2018-06-18 19:51:21,500 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000133 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:51:21,501 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000133
2018-06-18 19:51:21,501 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy :
localhost:40317
2018-06-18 19:51:21,503 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:22,505 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:23,506 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:23,538 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000133] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000133),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:24,509 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:24,558 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000133 buffer server: laptop-name:43987
2018-06-18 19:51:24,822 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000133),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:25,513 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:26,514 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:26,761 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 130
2018-06-18 19:51:26,761 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:27,515 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000133
2018-06-18 19:51:27,515 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000133
2018-06-18 19:51:27,517 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:27,519 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:28,521 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000133, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:28,522 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000133@localhost:40317
2018-06-18 19:51:28,522 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:28,559 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:29,560 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:29,560 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:29,563 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:30,569 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000134, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority132
2018-06-18 19:51:30,569 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000133
2018-06-18 19:51:30,571 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000134
2018-06-18 19:51:30,571 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:30,613 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:30,613 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000134
2018-06-18 19:51:30,613 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000134 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:51:30,614 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000134
2018-06-18 19:51:30,614 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy :
localhost:40317
2018-06-18 19:51:30,615 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:31,617 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:32,619 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:32,699 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000134] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000134),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:33,622 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:33,718 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000134 buffer server: laptop-name:37511
2018-06-18 19:51:33,974 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000134),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:34,625 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:35,629 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:35,899 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 131
2018-06-18 19:51:35,900 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:36,629 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000134
2018-06-18 19:51:36,630 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000134
2018-06-18 19:51:36,631 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:36,633 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:37,657 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000134, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:37,658 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000134@localhost:40317
2018-06-18 19:51:37,658 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:37,695 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:38,696 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:38,696 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:38,699 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:39,704 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000135, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority133
2018-06-18 19:51:39,704 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000134
2018-06-18 19:51:39,706 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000135
2018-06-18 19:51:39,706 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:39,751 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:39,752 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000135
2018-06-18 19:51:39,752 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000135 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:51:39,752 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000135
2018-06-18 19:51:39,752 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy :
localhost:40317
2018-06-18 19:51:39,756 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:40,760 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:41,761 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:41,928 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000135] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000135),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:42,765 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:42,947 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000135 buffer server: laptop-name:45501
2018-06-18 19:51:43,261 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000135),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:43,768 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:44,771 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:45,167 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 132
2018-06-18 19:51:45,168 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:45,771 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000135
2018-06-18 19:51:45,771 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000135
2018-06-18 19:51:45,773 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:45,774 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:46,778 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000135, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:46,778 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000135@localhost:40317
2018-06-18 19:51:46,778 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:46,816 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:47,816 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:47,816 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:47,820 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:48,825 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000136, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority134
2018-06-18 19:51:48,825 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000135
2018-06-18 19:51:48,826 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000136
2018-06-18 19:51:48,826 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:48,870 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:48,870 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000136
2018-06-18 19:51:48,870 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000136 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
2018-06-18 19:51:48,870 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000136
2018-06-18 19:51:48,870 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy :
localhost:40317 2018-06-18 19:51:48,872 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:51:49,874 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:51:50,876 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:51:51,044 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000136] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000136),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:51:51,879 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:51:52,063 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000136 buffer server: laptop-name:45059 2018-06-18 19:51:52,325 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000136),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:51:52,882 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:51:53,885 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:51:54,248 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 133
2018-06-18 19:51:54,249 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:51:54,885 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000136
2018-06-18 19:51:54,885 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000136
2018-06-18 19:51:54,887 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:54,888 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:55,890 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000136, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:51:55,891 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000136@localhost:40317
2018-06-18 19:51:55,891 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:51:55,927 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:51:56,928 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:51:56,928 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:51:56,931 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:51:57,937 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000137, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority135
2018-06-18 19:51:57,937 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000136
2018-06-18 19:51:57,938 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000137
2018-06-18 19:51:57,939 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:51:57,978 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:51:57,978 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000137
2018-06-18 19:51:57,979 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000137 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:51:57,979 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000137
2018-06-18 19:51:57,979 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:51:57,981 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:51:58,983 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:51:59,985 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:00,009 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000137] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000137),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:00,988 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:01,032 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000137 buffer server: laptop-name:42923
2018-06-18 19:52:01,386 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000137),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:01,991 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:02,995 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:03,237 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 134
2018-06-18 19:52:03,238 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:03,995 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000137
2018-06-18 19:52:03,996 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000137
2018-06-18 19:52:03,996 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:03,998 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:05,000 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000137, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:05,000 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000137@localhost:40317
2018-06-18 19:52:05,000 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:05,050 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:06,051 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:06,051 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:06,056 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:07,060 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000138, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority136
2018-06-18 19:52:07,060 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000137
2018-06-18 19:52:07,061 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000138
2018-06-18 19:52:07,061 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:07,110 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:07,111 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000138
2018-06-18 19:52:07,111 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000138 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:07,112 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000138
2018-06-18 19:52:07,112 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:07,113 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:08,119 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:09,121 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:09,608 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000138] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000138),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:10,122 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:10,626 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000138 buffer server: laptop-name:40339
2018-06-18 19:52:10,967 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000138),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:11,123 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:12,127 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:12,836 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 135
2018-06-18 19:52:12,837 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:13,128 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000138
2018-06-18 19:52:13,128 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000138
2018-06-18 19:52:13,129 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:13,132 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:14,133 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000138, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:14,133 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000138@localhost:40317
2018-06-18 19:52:14,133 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:14,156 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:15,156 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:15,157 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:15,160 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:16,165 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000139, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority137
2018-06-18 19:52:16,166 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000138
2018-06-18 19:52:16,167 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000139
2018-06-18 19:52:16,167 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:16,213 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:16,213 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000139
2018-06-18 19:52:16,213 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000139 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:16,213 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000139
2018-06-18 19:52:16,213 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:16,215 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:17,217 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:18,221 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:18,586 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000139] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000139),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:19,224 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:19,606 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000139 buffer server: laptop-name:35401
2018-06-18 19:52:19,856 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000139),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:20,226 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:21,228 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:21,776 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 136
2018-06-18 19:52:21,776 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:22,229 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000139
2018-06-18 19:52:22,229 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000139
2018-06-18 19:52:22,230 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:22,233 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:23,235 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000139, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:23,235 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000139@localhost:40317
2018-06-18 19:52:23,235 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:23,291 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:24,291 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:24,291 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:24,295 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:25,301 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000140, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority138
2018-06-18 19:52:25,301 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000139
2018-06-18 19:52:25,302 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000140
2018-06-18 19:52:25,302 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:25,349 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:25,349 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000140
2018-06-18 19:52:25,349 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000140 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:25,349 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000140
2018-06-18 19:52:25,350 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:25,354 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:26,358 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:27,362 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:27,956 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000140] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000140),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:28,364 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:28,968 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000140 buffer server: laptop-name:41325
2018-06-18 19:52:29,290 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000140),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:29,366 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:30,369 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:31,157 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 137
2018-06-18 19:52:31,158 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:31,369 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000140
2018-06-18 19:52:31,369 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000140
2018-06-18 19:52:31,371 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:31,372 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:32,374 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000140, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:32,375 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000140@localhost:40317
2018-06-18 19:52:32,375 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:32,408 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:33,409 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:33,409 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:33,412 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:34,417 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000141, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority139
2018-06-18 19:52:34,418 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000140
2018-06-18 19:52:34,419 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000141
2018-06-18 19:52:34,419 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:34,460 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:34,460 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000141
2018-06-18 19:52:34,460 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000141 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:34,461 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000141
2018-06-18 19:52:34,461 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:34,463 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:35,465 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:36,466 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:36,560 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000141] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000141),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:37,469 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:37,580 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000141 buffer server: laptop-name:38057
2018-06-18 19:52:37,828 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) context: PTContainer[id=2(container_1529349239295_0005_01_000141),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]] 2018-06-18 19:52:38,473 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:52:39,476 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map 2018-06-18 19:52:39,749 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 138 2018-06-18 19:52:39,750 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE] 2018-06-18 19:52:40,477 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000141 2018-06-18 19:52:40,477 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000141 2018-06-18 19:52:40,478 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317 2018-06-18 19:52:40,480 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! 
Trimming the end window stats map
2018-06-18 19:52:41,482 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000140, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:41,482 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000141@localhost:40317
2018-06-18 19:52:41,482 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:41,555 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:42,555 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:42,555 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:42,559 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:43,563 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000142, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority140
2018-06-18 19:52:43,564 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000141
2018-06-18 19:52:43,565 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000142
2018-06-18 19:52:43,565 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:43,613 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:43,613 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000142
2018-06-18 19:52:43,614 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000142 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:43,614 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000142
2018-06-18 19:52:43,614 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:43,616 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:44,627 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:45,629 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:46,085 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000142] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000142),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:46,632 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:47,104 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000142 buffer server: laptop-name:39235
2018-06-18 19:52:47,346 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000142),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:47,633 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:48,636 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:49,266 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 139
2018-06-18 19:52:49,266 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:49,636 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000142
2018-06-18 19:52:49,637 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000142
2018-06-18 19:52:49,637 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:49,638 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:50,640 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000142, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:50,640 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000142@localhost:40317
2018-06-18 19:52:50,640 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:50,679 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:51,679 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:52:51,679 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:52:51,682 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:52,687 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000143, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority141
2018-06-18 19:52:52,687 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000142
2018-06-18 19:52:52,688 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000143
2018-06-18 19:52:52,688 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:52:52,712 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:52:52,712 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000143
2018-06-18 19:52:52,712 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000143 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:52:52,712 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000143
2018-06-18 19:52:52,712 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:52,713 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:53,715 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:54,717 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:54,775 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000143] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000143),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:55,719 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:55,786 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000143 buffer server: laptop-name:34799
2018-06-18 19:52:56,122 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000143),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:52:56,720 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:57,724 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:52:58,017 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 140
2018-06-18 19:52:58,018 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:52:58,725 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000143
2018-06-18 19:52:58,725 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000143
2018-06-18 19:52:58,726 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:52:58,728 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:52:59,730 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000143, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:52:59,730 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000143@localhost:40317
2018-06-18 19:52:59,730 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:52:59,777 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:00,778 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:53:00,778 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:53:00,781 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:01,786 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000144, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority142
2018-06-18 19:53:01,786 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000143
2018-06-18 19:53:01,788 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000144
2018-06-18 19:53:01,788 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:53:01,830 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:53:01,830 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000144
2018-06-18 19:53:01,831 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000144 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:53:01,831 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000144
2018-06-18 19:53:01,831 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:53:01,833 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:02,835 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:03,837 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:04,118 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000144] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000144),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:53:04,840 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:05,137 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000144 buffer server: laptop-name:40071
2018-06-18 19:53:05,407 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000144),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:53:05,842 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:06,845 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:07,331 WARN com.datatorrent.stram.StreamingContainerManager: Operator failure: PTOperator[id=2,name=Aggregator,state=INACTIVE] count: 141
2018-06-18 19:53:07,332 ERROR com.datatorrent.stram.StreamingContainerManager: Initiating container restart after operator failure PTOperator[id=2,name=Aggregator,state=INACTIVE]
2018-06-18 19:53:07,845 INFO com.datatorrent.stram.StreamingAppMasterService: Requested stop container container_1529349239295_0005_01_000144
2018-06-18 19:53:07,845 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: STOP_CONTAINER for Container container_1529349239295_0005_01_000144
2018-06-18 19:53:07,847 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:53:07,849 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows!
Trimming the end window stats map
2018-06-18 19:53:08,851 INFO com.datatorrent.stram.StreamingAppMasterService: Completed containerId=container_1529349239295_0005_01_000144, state=COMPLETE, exitStatus=-105, diagnostics=Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2018-06-18 19:53:08,851 INFO com.datatorrent.stram.StreamingContainerManager: Initiating recovery for container_1529349239295_0005_01_000144@localhost:40317
2018-06-18 19:53:08,851 INFO com.datatorrent.stram.StreamingContainerManager: Affected operators [PTOperator[id=2,name=Aggregator,state=INACTIVE], PTOperator[id=3,name=FileOutput,state=ACTIVE]]
2018-06-18 19:53:08,887 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:09,887 INFO com.datatorrent.stram.ResourceRequestHandler: Strict anti-affinity = [] for container with operators PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]
2018-06-18 19:53:09,887 INFO com.datatorrent.stram.ResourceRequestHandler: Found host null
2018-06-18 19:53:09,891 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:10,893 INFO com.datatorrent.stram.StreamingAppMasterService: Got new container., containerId=container_1529349239295_0005_01_000145, containerNode=localhost:40317, containerNodeURI=localhost:8042, containerResourceMemory2048, priority143
2018-06-18 19:53:10,893 INFO com.datatorrent.stram.StreamingContainerManager: Removing container agent container_1529349239295_0005_01_000144
2018-06-18 19:53:10,894 INFO com.datatorrent.stram.LaunchContainerRunnable: Setting up container launch context for containerid=container_1529349239295_0005_01_000145
2018-06-18 19:53:10,894 INFO com.datatorrent.stram.LaunchContainerRunnable: CLASSPATH: ./*:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:.
2018-06-18 19:53:10,905 INFO com.datatorrent.common.util.BasicContainerOptConfigurator: property map for operator {-Xmx=768m, Generic=null}
2018-06-18 19:53:10,905 INFO com.datatorrent.stram.LaunchContainerRunnable: Jvm opts -Xmx1342177280 for container container_1529349239295_0005_01_000145
2018-06-18 19:53:10,905 INFO com.datatorrent.stram.LaunchContainerRunnable: Launching on node: localhost:40317 command: $JAVA_HOME/bin/java -Xmx1342177280 -Ddt.attr.APPLICATION_PATH=hdfs://localhost:8020/user/apex/datatorrent/apps/application_1529349239295_0005 -Djava.io.tmpdir=$PWD/tmp -Ddt.cid=container_1529349239295_0005_01_000145 -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir= -Dapex.application.name=$'MyFirstApplication' com.datatorrent.stram.engine.StreamingContainer 1>/stdout 2>/stderr
2018-06-18 19:53:10,905 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Processing Event EventType: START_CONTAINER for Container container_1529349239295_0005_01_000145
2018-06-18 19:53:10,905 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : localhost:40317
2018-06-18 19:53:10,906 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:11,908 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:12,910 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:13,167 INFO com.datatorrent.stram.StreamingContainerParent: child msg: [container_1529349239295_0005_01_000145] Entering heartbeat loop.. context: PTContainer[id=2(container_1529349239295_0005_01_000145),state=ALLOCATED,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
2018-06-18 19:53:13,913 WARN com.datatorrent.stram.StreamingContainerManager: Some operators are behind for more than 1000 windows! Trimming the end window stats map
2018-06-18 19:53:14,186 INFO com.datatorrent.stram.StreamingContainerManager: Container container_1529349239295_0005_01_000145 buffer server: laptop-name:34857
2018-06-18 19:53:14,427 INFO com.datatorrent.stram.StreamingContainerParent: child msg: Stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
context: PTContainer[id=2(container_1529349239295_0005_01_000145),state=ACTIVE,operators=[PTOperator[id=2,name=Aggregator,state=PENDING_DEPLOY]]]
End of LogType:apex.log

Container: container_1529349239295_0005_01_000117 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:48:55,872 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:48:57,658 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000117/tmp as the basepath for spooling.
2018-06-18 19:48:57,663 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46405
2018-06-18 19:48:58,743 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:48:58,883 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:48:58,914 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46405/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:59,005 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000117/tmp/chkp4923910233464740519 as the basepath for checkpointing.
2018-06-18 19:48:59,014 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:48:59,147 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@64614d64 for node 2
2018-06-18 19:48:59,182 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:59,182 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746075_5251
2018-06-18 19:48:59,184 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:59,185 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:49:00,944 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:49:00,947 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:00,952 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@18e325e7{identifier=tcp://laptop-name:46405/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6046d641{da=com.datatorrent.bufferserver.internal.DataList$Block@25d6bd28{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@192c3a25[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000084 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:43:52,958 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:43:54,183 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000084/tmp as the basepath for spooling.
2018-06-18 19:43:54,188 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43111
2018-06-18 19:43:55,264 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:43:55,353 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:43:55,422 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000084/tmp/chkp4341310677579808419 as the basepath for checkpointing.
2018-06-18 19:43:55,437 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:43:55,456 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43111/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:55,557 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@3f4b87c1 for node 2
2018-06-18 19:43:55,673 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:43:55,673 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:43:57,395 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:43:57,397 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:43:57,402 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@51934f6a{identifier=tcp://laptop-name:43111/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@68299525{da=com.datatorrent.bufferserver.internal.DataList$Block@6ac85f3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2141, starting_window=5b28089000000001, ending_window=5b28089000000012, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@71b79da7[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000051 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:38:52,075 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:38:53,229 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000051/tmp as the basepath for spooling.
2018-06-18 19:38:53,233 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37559
2018-06-18 19:38:54,324 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:38:54,397 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37559/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:54,450 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:38:54,512 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000051/tmp/chkp2406745200389890771 as the basepath for checkpointing.
2018-06-18 19:38:54,530 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:38:54,640 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4d0ecdae for node 2
2018-06-18 19:38:54,744 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:38:54,744 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:38:56,483 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:38:56,486 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:56,490 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@747032f3{identifier=tcp://laptop-name:37559/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7fb0a429{da=com.datatorrent.bufferserver.internal.DataList$Block@580a16bd{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3266, starting_window=5b28089000000001, ending_window=5b28089000000017, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@65bb70a6[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000018 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23943
Log Contents:
2018-06-18 19:33:51,163 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:33:52,320 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000018/tmp as the basepath for spooling.
2018-06-18 19:33:52,323 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33185
2018-06-18 19:33:53,414 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:33:53,506 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:33:53,588 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000018/tmp/chkp2345871801554276620 as the basepath for checkpointing.
2018-06-18 19:33:53,594 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:33:53,626 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33185/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:53,713 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4373019 for node 2
2018-06-18 19:33:53,719 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:53,720 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744178_3354
2018-06-18 19:33:53,722 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:53,723 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:33:55,541 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:33:55,543 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:55,549 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@c311cb8{identifier=tcp://laptop-name:33185/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ebdb22{da=com.datatorrent.bufferserver.internal.DataList$Block@1a5597d4{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@54a01df1[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000109 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19475
Log Contents:
2018-06-18 19:47:42,049 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:47:43,949 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000109/tmp as the basepath for spooling. 
2018-06-18 19:47:43,954 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41459
2018-06-18 19:47:45,069 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:47:45,190 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:47:45,367 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000109/tmp/chkp3425771496886388826 as the basepath for checkpointing.
2018-06-18 19:47:45,387 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:47:45,511 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@61e83cdd for node 2
2018-06-18 19:47:45,539 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41459/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:47,220 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:47:47,222 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:47,224 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@51ca9dc4{identifier=tcp://laptop-name:41459/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7ac9a29d{da=com.datatorrent.bufferserver.internal.DataList$Block@45555598{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@748c2874[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000076 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:42:40,042 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:42:41,192 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000076/tmp as the basepath for spooling. 
2018-06-18 19:42:41,196 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37547
2018-06-18 19:42:42,278 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:42:42,364 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:42:42,430 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:42:42,619 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37547/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:44,384 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:42:44,387 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:42:44,392 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@74c3cb9e{identifier=tcp://laptop-name:37547/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6fdd10eb{da=com.datatorrent.bufferserver.internal.DataList$Block@7a3e37cf{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7b9149c0[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000043 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:37:39,158 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:37:40,355 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000043/tmp as the basepath for spooling.
2018-06-18 19:37:40,359 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37551
2018-06-18 19:37:41,451 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:37:41,536 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37551/2.out.1, windowId=5b28089000000016, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:41,585 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:37:41,642 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000043/tmp/chkp7353097911990779737 as the basepath for checkpointing.
2018-06-18 19:37:41,671 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:37:41,859 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1880887d for node 2
2018-06-18 19:37:41,889 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:37:41,890 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:37:43,621 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:37:43,623 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:43,628 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2d6fb16e{identifier=tcp://laptop-name:37551/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4e28d984{da=com.datatorrent.bufferserver.internal.DataList$Block@147ad7c2{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=7613, starting_window=5b28089000000001, ending_window=5b28089000000025, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@32f63e17[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000010 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:32:37,987 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:32:39,136 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000010/tmp as the basepath for spooling.
2018-06-18 19:32:39,139 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34205
2018-06-18 19:32:40,227 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:32:40,311 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:32:40,381 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000010/tmp/chkp3694599445954351594 as the basepath for checkpointing.
2018-06-18 19:32:40,396 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:32:40,510 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@596be673 for node 2
2018-06-18 19:32:40,607 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34205/2.out.1, windowId=5b28089000000011, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:32:40,636 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:32:40,637 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:32:42,339 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:32:42,341 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:42,346 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@75460ea9identifier=tcp://laptop-name:34205/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7fe18324{da=com.datatorrent.bufferserver.internal.DataList$Block@35818b4f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2789, starting_window=5b28089000000001, ending_window=5b28089000000015, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@30d21197[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000142 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:52:44,639 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:52:46,059 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000142/tmp as the basepath for spooling. 
2018-06-18 19:52:46,063 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39235
2018-06-18 19:52:47,157 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:52:47,252 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:52:47,323 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000142/tmp/chkp8400598342457457877 as the basepath for checkpointing.
2018-06-18 19:52:47,327 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:52:47,447 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@72719951 for node 2
2018-06-18 19:52:47,542 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39235/2.out.1, windowId=5b28089000000032, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:47,577 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:52:47,578 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:52:49,270 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:52:49,273 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:49,277 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@248c49a0identifier=tcp://laptop-name:39235/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@33ef079c{da=com.datatorrent.bufferserver.internal.DataList$Block@4bd0814c{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6517ef2b[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000101 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:46:27,851 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:46:29,125 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000101/tmp as the basepath for spooling.
2018-06-18 19:46:29,131 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38305
2018-06-18 19:46:30,285 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:46:30,387 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:46:30,461 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000101/tmp/chkp7185049574118964840 as the basepath for checkpointing.
2018-06-18 19:46:30,467 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:46:30,586 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@57a41695 for node 2
2018-06-18 19:46:30,586 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:30,587 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745763_4939
2018-06-18 19:46:30,589 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:30,590 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:46:30,641 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38305/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:46:32,415 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:46:32,418 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:46:32,423 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5dfe139aidentifier=tcp://laptop-name:38305/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@72d65ac{da=com.datatorrent.bufferserver.internal.DataList$Block@45a8e759{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@1738fe87[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000068 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:41:27,107 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:28,300 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000068/tmp as the basepath for spooling. 
2018-06-18 19:41:28,305 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42427 2018-06-18 19:41:29,407 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:41:29,533 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:41:29,605 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000068/tmp/chkp1798509953269319678 as the basepath for checkpointing. 2018-06-18 19:41:29,624 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:41:29,732 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42427/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:41:29,736 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4a5786cc for node 2 2018-06-18 19:41:29,881 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:41:29,882 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:41:31,562 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:41:31,565 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:31,569 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e37fff6identifier=tcp://laptop-name:42427/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@342ac17e{da=com.datatorrent.bufferserver.internal.DataList$Block@58a8ddd7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3518, starting_window=5b28089000000001, ending_window=5b28089000000018, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4fdcb5f8[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000035 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:36:26,246 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:36:27,454 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000035/tmp as the basepath for spooling. 
2018-06-18 19:36:27,458 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46709 2018-06-18 19:36:28,550 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:36:28,612 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46709/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:28,671 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:36:28,733 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000035/tmp/chkp8403709180662772207 as the basepath for checkpointing. 2018-06-18 19:36:28,739 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:36:28,863 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@12b8767d for node 2 2018-06-18 19:36:28,866 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:36:28,868 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744502_3678 2018-06-18 19:36:28,870 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:36:28,871 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:36:30,700 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:36:30,703 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:36:30,709 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6834d09cidentifier=tcp://laptop-name:46709/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@52e1495e{da=com.datatorrent.bufferserver.internal.DataList$Block@1f071fec{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2a279d3a[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000002 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:24482 Log Contents: 2018-06-18 19:31:39,917 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:31:41,930 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000002/tmp as the basepath for spooling.
2018-06-18 19:31:41,934 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39089
2018-06-18 19:31:42,996 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:43,501 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:44,005 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:44,514 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:45,023 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:45,486 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39089/2.out.1, windowId=ffffffffffffffff, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:31:45,543 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:31:45,634 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:31:45,883 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000002/tmp/chkp3729271174049276295 as the basepath for checkpointing.
2018-06-18 19:31:45,961 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:31:46,109 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1c645a32 for node 2
2018-06-18 19:31:46,229 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:31:46,421 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073743875_3051
2018-06-18 19:31:46,423 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:31:46,425 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:31:47,680 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:31:47,683 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:31:47,688 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3f967747identifier=tcp://laptop-name:39089/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@12aa684b{da=com.datatorrent.bufferserver.internal.DataList$Block@1b290941{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1403, starting_window=5b28089000000001, ending_window=5b2808900000000e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@405c4e89[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000134 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:51:31,495 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:51:32,676 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000134/tmp as the basepath for spooling. 
2018-06-18 19:51:32,681 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37511
2018-06-18 19:51:33,783 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:51:33,880 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:51:33,947 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:51:34,179 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37511/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:35,903 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:51:35,906 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:35,911 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@37595397identifier=tcp://laptop-name:37511/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d6aab23{da=com.datatorrent.bufferserver.internal.DataList$Block@7b631e54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6f72586a[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000093 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:45:14,955 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:16,216 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000093/tmp as the basepath for spooling. 
2018-06-18 19:45:16,220 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33513
2018-06-18 19:45:17,345 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33513/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:45:17,346 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:45:17,445 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:45:17,498 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:45:19,468 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:45:19,472 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:45:19,477 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@53abe0f7identifier=tcp://laptop-name:33513/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@370068c2{da=com.datatorrent.bufferserver.internal.DataList$Block@53ce9282{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3029f9ec[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000060 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19004
Log Contents:
2018-06-18 19:40:14,127 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:40:15,313 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000060/tmp as the basepath for spooling. 
2018-06-18 19:40:15,318 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33775
2018-06-18 19:40:16,393 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:16,481 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:16,546 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:16,856 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33775/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:40:18,500 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:18,502 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:18,506 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@53dc9886identifier=tcp://laptop-name:33775/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6484767a{da=com.datatorrent.bufferserver.internal.DataList$Block@7485a6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@34966eab[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000027 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:35:13,276 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:35:14,443 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000027/tmp as the basepath for spooling. 
2018-06-18 19:35:14,447 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43109
2018-06-18 19:35:15,520 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:35:15,606 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:35:15,668 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43109/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:15,676 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:35:17,628 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:35:17,631 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:17,637 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@177a3f99identifier=tcp://laptop-name:43109/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@57918e12{da=com.datatorrent.bufferserver.internal.DataList$Block@25caadaa{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@1ad8f344[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000126 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22076
Log Contents:
2018-06-18 19:50:18,608 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:50:19,842 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000126/tmp as the basepath for spooling.
2018-06-18 19:50:19,846 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42919
2018-06-18 19:50:20,938 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:50:21,059 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:50:21,130 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000126/tmp/chkp4905206427411281651 as the basepath for checkpointing.
2018-06-18 19:50:21,133 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:50:21,254 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@50d8d21c for node 2
2018-06-18 19:50:21,351 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42919/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:50:21,371 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:50:21,372 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:50:21,371 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block BP-2135041833-172.17.0.3-1526202085113:blk_1073746248_5424
java.nio.channels.ClosedChannelException
	at sun.nio.ch.SocketChannelImpl.ensureReadOpen(SocketChannelImpl.java:257)
	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:300)
	at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2280)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:244)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer$ResponseProcessor.run(DFSOutputStream.java:733)
2018-06-18 19:50:23,082 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:50:23,084 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:23,089 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@a9e869cidentifier=tcp://laptop-name:42919/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1451dfc6{da=com.datatorrent.bufferserver.internal.DataList$Block@2f6e02b3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@45fbf1ff[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000085 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:44:02,093 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:44:03,284 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000085/tmp as the basepath for spooling. 
2018-06-18 19:44:03,289 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34579
2018-06-18 19:44:04,392 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:44:04,491 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:44:04,510 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34579/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:44:04,566 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000085/tmp/chkp8755946449887651686 as the basepath for checkpointing.
2018-06-18 19:44:04,574 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:44:04,693 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@27126894 for node 2
2018-06-18 19:44:04,706 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:44:04,707 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745458_4634
2018-06-18 19:44:04,709 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:44:04,711 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:44:06,524 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:44:06,527 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:44:06,532 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@579a14f3identifier=tcp://laptop-name:34579/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1067c523{da=com.datatorrent.bufferserver.internal.DataList$Block@72528f11{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6e40ab3[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000052 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:39:01,256 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:39:02,415 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000052/tmp as the basepath for spooling. 
2018-06-18 19:39:02,420 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39383 2018-06-18 19:39:03,510 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:39:03,616 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:39:03,688 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000052/tmp/chkp6282085953677179038 as the basepath for checkpointing. 2018-06-18 19:39:03,695 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:39:03,811 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@10dbc3bf for node 2 2018-06-18 19:39:03,851 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:39:03,852 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744829_4005 2018-06-18 19:39:03,854 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:39:03,855 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:39:03,940 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39383/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:39:05,637 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:39:05,640 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:39:05,645 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@d2c920cidentifier=tcp://laptop-name:39383/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7e44c500{da=com.datatorrent.bufferserver.internal.DataList$Block@7a37e6f0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7efd6754[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000019 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:34:00,342 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:34:01,501 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000019/tmp as the basepath for spooling. 
2018-06-18 19:34:01,505 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:32783 2018-06-18 19:34:02,602 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:34:02,707 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:32783/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:34:02,743 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:34:02,801 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000019/tmp/chkp6876493291957271865 as the basepath for checkpointing. 2018-06-18 19:34:02,809 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:34:02,932 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2c6bd55d for node 2 2018-06-18 19:34:02,943 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:02,944 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744198_3374 2018-06-18 19:34:02,946 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:02,947 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:34:04,777 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:34:04,780 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:34:04,786 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7646ee46identifier=tcp://laptop-name:32783/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3a12f49d{da=com.datatorrent.bufferserver.internal.DataList$Block@4ae850c0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@16dbd0ea[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000118 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:49:05,719 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:49:07,143 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000118/tmp as the basepath for spooling.
2018-06-18 19:49:07,146 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46121
2018-06-18 19:49:08,240 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:49:08,368 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:49:08,432 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46121/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:08,446 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000118/tmp/chkp1729709706710951999 as the basepath for checkpointing.
2018-06-18 19:49:08,455 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:49:08,572 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@f215a2d for node 2
2018-06-18 19:49:08,574 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:49:08,575 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746095_5271
2018-06-18 19:49:08,577 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:49:08,578 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:49:10,386 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:49:10,388 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:10,390 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1a1b9eaidentifier=tcp://laptop-name:46121/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4290e0f4{da=com.datatorrent.bufferserver.internal.DataList$Block@3e28c5f7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198, starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@537cdf7e[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000077 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:42:49,095 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:42:50,238 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000077/tmp as the basepath for spooling.
2018-06-18 19:42:50,242 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40569
2018-06-18 19:42:51,340 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:42:51,444 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:42:51,517 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000077/tmp/chkp3439267870682945533 as the basepath for checkpointing.
2018-06-18 19:42:51,526 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:42:51,642 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@dcac4f1 for node 2
2018-06-18 19:42:51,731 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40569/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:51,756 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:42:51,757 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:42:53,472 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:42:53,475 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:42:53,479 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3e3abc23identifier=tcp://laptop-name:40569/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@14ec1341{da=com.datatorrent.bufferserver.internal.DataList$Block@5ea34387{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=813, starting_window=5b28089000000001, ending_window=5b2808900000000a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@42857751[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000044 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:37:48,326 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:37:49,516 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000044/tmp as the basepath for spooling.
2018-06-18 19:37:49,520 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43271
2018-06-18 19:37:50,609 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43271/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:50,646 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:37:50,734 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:37:50,799 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000044/tmp/chkp2659981686079971283 as the basepath for checkpointing.
2018-06-18 19:37:50,805 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:37:50,922 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@188480f6 for node 2
2018-06-18 19:37:50,931 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:50,931 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744675_3851
2018-06-18 19:37:50,933 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:50,934 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:37:52,772 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:37:52,775 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:52,779 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@8d1d82cidentifier=tcp://laptop-name:43271/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63e0175c{da=com.datatorrent.bufferserver.internal.DataList$Block@15108860{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3050f607[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000011 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:32:47,295 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:32:48,640 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000011/tmp as the basepath for spooling. 
2018-06-18 19:32:48,644 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39771 2018-06-18 19:32:49,733 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:32:49,834 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:32:49,904 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000011/tmp/chkp4724645281683847358 as the basepath for checkpointing. 2018-06-18 19:32:49,910 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:32:50,029 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1ec14090 for node 2 2018-06-18 19:32:50,038 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:32:50,039 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744045_3221 2018-06-18 19:32:50,041 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:32:50,043 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:32:50,131 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39771/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:51,863 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:32:51,872 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:32:51,872 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4413e99fidentifier=tcp://laptop-name:39771/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649e17de{da=com.datatorrent.bufferserver.internal.DataList$Block@2accef54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2c195f8a[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000143 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:52:53,540 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:52:54,755 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000143/tmp as the basepath for spooling. 
2018-06-18 19:52:54,758 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34799 2018-06-18 19:52:55,842 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:52:56,005 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:52:56,032 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34799/2.out.1, windowId=5b28089000000032, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:52:56,094 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000143/tmp/chkp7750700188054238220 as the basepath for checkpointing. 2018-06-18 19:52:56,100 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:52:56,224 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@352aa3f1 for node 2 2018-06-18 19:52:56,272 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:52:56,277 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746572_5748 2018-06-18 19:52:56,280 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:52:58,022 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:52:58,025 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:52:58,032 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@648c901a{identifier=tcp://laptop-name:34799/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@742a94e4{da=com.datatorrent.bufferserver.internal.DataList$Block@5bd060cf{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3650e23a[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000110 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22163 Log Contents: 2018-06-18 19:47:52,070 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:47:53,591 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000110/tmp as the basepath for spooling. 
2018-06-18 19:47:53,599 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39119 2018-06-18 19:47:54,729 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:47:54,841 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:47:54,917 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000110/tmp/chkp1200186845952838159 as the basepath for checkpointing. 2018-06-18 19:47:54,925 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:47:55,044 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@58e6a91 for node 2 2018-06-18 19:47:55,053 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:47:55,054 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745940_5116 2018-06-18 19:47:55,056 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:47:55,072 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39119/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:47:56,879 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:47:56,889 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@c311cb8identifier=tcp://laptop-name:39119/2.out.1, 
upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ebdb22{da=com.datatorrent.bufferserver.internal.DataList$Block@1a5597d4{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@54a01df1[identifier=2.out.1] 2018-06-18 19:47:56,890 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000069 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22161 Log Contents: 2018-06-18 19:41:36,208 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:37,392 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000069/tmp as the basepath for spooling. 
2018-06-18 19:41:37,395 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41671 2018-06-18 19:41:38,468 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:41:38,558 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:41:38,630 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000069/tmp/chkp375685388153228307 as the basepath for checkpointing. 2018-06-18 19:41:38,636 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:41:38,757 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@e62ad1 for node 2 2018-06-18 19:41:38,773 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41671/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:41:38,774 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:41:38,775 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745151_4327 2018-06-18 19:41:38,784 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:41:40,583 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:41:40,585 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:41:40,588 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6b96bbb7identifier=tcp://laptop-name:41671/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@74a4a35{da=com.datatorrent.bufferserver.internal.DataList$Block@c9b8c3e{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3d27ccfd[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000036 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:36:35,408 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:36:36,553 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000036/tmp as the basepath for spooling.
2018-06-18 19:36:36,557 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35763
2018-06-18 19:36:37,666 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:37,695 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35763/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:37,788 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:37,845 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000036/tmp/chkp8880422483845226220 as the basepath for checkpointing.
2018-06-18 19:36:37,856 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:37,974 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5f0700e3 for node 2
2018-06-18 19:36:37,976 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:37,976 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744521_3697
2018-06-18 19:36:37,978 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:37,979 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:36:39,819 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:39,822 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
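The ExpectedException in the trace above is thrown deliberately by the test's failure injector (`aptest.FailureGenerator.failOrNot`). Its source is not part of this log; a minimal, hypothetical reconstruction of such an injector, which throws on every Nth call so the platform's redeploy-and-recover path can be exercised, might look like:

```java
// Hypothetical sketch of a failure injector like aptest.FailureGenerator.
// The class and method names come from the stack trace above; the counting
// logic and the FAIL_EVERY threshold are assumptions for illustration only.
public class FailureGenerator {
  // Unchecked, so it can propagate out of an operator's process() callback.
  public static class ExpectedException extends RuntimeException {}

  private static int calls = 0;
  private static final int FAIL_EVERY = 5; // assumed injection interval

  // Throw on every FAIL_EVERY-th invocation; otherwise do nothing.
  public static void failOrNot() {
    if (++calls % FAIL_EVERY == 0) {
      throw new ExpectedException();
    }
  }

  public static void main(String[] args) {
    int failures = 0;
    for (int i = 0; i < 10; i++) {
      try {
        failOrNot();
      } catch (ExpectedException e) {
        failures++;
      }
    }
    // 10 calls with an interval of 5 inject exactly 2 failures.
    System.out.println("failures=" + failures);
  }
}
```

An operator's input-port `process()` method would call `failOrNot()` on each tuple, producing the kind of operator failure and undeploy/redeploy sequence recorded in this log.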
2018-06-18 19:36:39,826 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2488f715{identifier=tcp://laptop-name:35763/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2b378a35{da=com.datatorrent.bufferserver.internal.DataList$Block@2bebb9c1{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@343d10d1[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000003 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:376271
Log Contents:
2018-06-18 19:31:41,398 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:
/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:
/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:
/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:31:43,501 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000003/tmp as the basepath for spooling. 
2018-06-18 19:31:43,506 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41209 2018-06-18 19:31:44,581 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:31:45,091 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:31:45,632 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=1,name=SequenceGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=aggregate,bufferServer=laptop-name]]]] 2018-06-18 19:31:45,645 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:31:45,719 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=1.out.1, windowId=ffffffffffffffff} 2018-06-18 19:31:45,721 INFO com.datatorrent.stram.engine.WindowGenerator: Catching up from 1529350288500 to 1529350305721 2018-06-18 19:31:45,794 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000003/tmp/chkp2236443757530963049 as the basepath for checkpointing. 
2018-06-18 19:31:47,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3ba312bf{ln=LogicalNode@a48f2e4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@755c1ed{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=987, starting_window=5b28089000000001, ending_window=5b28089000000029, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:31:47,766 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@a48f2e4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@755c1ed{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, 
writingOffset=987, starting_window=5b28089000000001, ending_window=5b28089000000029, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:31:54,666 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:31:57,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@696a728e{ln=LogicalNode@4822c96cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@351167d3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=1422, starting_window=5b28089000000001, ending_window=5b2808900000003b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:31:57,246 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4822c96cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@351167d3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=1422, starting_window=5b28089000000001, ending_window=5b2808900000003b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:03,812 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:06,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@578667df{ln=LogicalNode@35e8b35fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@798f8183{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=1918, starting_window=5b28089000000001, ending_window=5b2808900000004e, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:06,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@35e8b35fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@798f8183{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=1918, starting_window=5b28089000000001, ending_window=5b2808900000004e, 
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:13,058 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:15,245 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@f2c596e{ln=LogicalNode@511b884eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@49559fc3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=2404, starting_window=5b28089000000001, ending_window=5b28089000000060, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:15,247 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@511b884eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@49559fc3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=2404, starting_window=5b28089000000001, ending_window=5b28089000000060, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:22,106 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:24,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@3547b52d{ln=LogicalNode@45c58a91identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38769955{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=2875, starting_window=5b28089000000001, ending_window=5b28089000000072, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:24,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@45c58a91identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38769955{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=2890, starting_window=5b28089000000001, ending_window=5b28089000000072, 
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:31,538 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:33,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@28cf5c6e{ln=LogicalNode@259bf8c1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@124587b4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=3388, starting_window=5b28089000000001, ending_window=5b28089000000085, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:33,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@259bf8c1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@124587b4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=3403, starting_window=5b28089000000001, ending_window=5b28089000000085, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:40,330 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:42,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@2d7dcf82{ln=LogicalNode@440559beidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6b3e9e65{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=3889, starting_window=5b28089000000001, ending_window=5b28089000000097, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:42,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@440559beidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6b3e9e65{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=3889, starting_window=5b28089000000001, ending_window=5b28089000000097, 
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:49,853 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:52,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@167b417c{ln=LogicalNode@6cdcfe82identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4adbdfe3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=4402, starting_window=5b28089000000001, ending_window=5b280890000000aa, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:32:52,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6cdcfe82identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4adbdfe3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=4402, starting_window=5b28089000000001, ending_window=5b280890000000aa, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:32:58,760 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:33:01,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@2f1e2d44{ln=LogicalNode@413859b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@589c2c7e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=4888, starting_window=5b28089000000001, ending_window=5b280890000000bc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Broken pipe at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:33:01,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@413859b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@589c2c7e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=4888, starting_window=5b28089000000001, ending_window=5b280890000000bc, 
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:08,032 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:10,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@d04804a{ln=LogicalNode@6a74786identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21dda9c0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=5374, starting_window=5b28089000000001, ending_window=5b280890000000ce, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Broken pipe
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:10,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6a74786identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21dda9c0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=5374, starting_window=5b28089000000001, ending_window=5b280890000000ce, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:17,182 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:19,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@588d2e1b{ln=LogicalNode@5bfb1963identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@545d4750{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=5860, starting_window=5b28089000000001, ending_window=5b280890000000e0, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Broken pipe
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:19,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5bfb1963identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@545d4750{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=5860, starting_window=5b28089000000001, ending_window=5b280890000000e0, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:26,299 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:28,746 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@72d4c486{ln=LogicalNode@4ef3262didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1222f450{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=6373, starting_window=5b28089000000001, ending_window=5b280890000000f3, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Broken pipe
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:28,747 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4ef3262didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1222f450{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=6373, starting_window=5b28089000000001, ending_window=5b280890000000f3, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:35,338 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:37,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@7fbf341e{ln=LogicalNode@7c0c0a8bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@11b4e2e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=6859, starting_window=5b28089000000001, ending_window=5b28089000000105, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Broken pipe
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:37,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7c0c0a8bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@11b4e2e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=6859, starting_window=5b28089000000001, ending_window=5b28089000000105, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:44,480 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:46,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@28281178{ln=LogicalNode@7fbd981eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@296973b9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=7322, starting_window=5b28089000000001, ending_window=5b28089000000116, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:46,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7fbd981eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@296973b9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=7322, starting_window=5b28089000000001, ending_window=5b28089000000116, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:33:53,528 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:55,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@79244c33{ln=LogicalNode@65e059ceidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@48d1fe8c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=7808, starting_window=5b28089000000001, ending_window=5b28089000000128, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:33:55,739 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@65e059ceidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@48d1fe8c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=7808, starting_window=5b28089000000001, ending_window=5b28089000000128, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:02,744 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:05,245 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@d2b27db{ln=LogicalNode@1ec53308identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d878116{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=8321, starting_window=5b28089000000001, ending_window=5b2808900000013b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:05,246 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1ec53308identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d878116{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=8321, starting_window=5b28089000000001, ending_window=5b2808900000013b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:11,883 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:14,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@455bb137{ln=LogicalNode@2212be7bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@27f36dbb{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=8807, starting_window=5b28089000000001, ending_window=5b2808900000014d, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:14,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2212be7bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@27f36dbb{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=8807, starting_window=5b28089000000001, ending_window=5b2808900000014d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:20,921 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:23,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@7653501d{ln=LogicalNode@41b970acidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@489e00c6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=9293, starting_window=5b28089000000001, ending_window=5b2808900000015f, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:23,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@41b970acidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@489e00c6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=9293, starting_window=5b28089000000001, ending_window=5b2808900000015f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:30,236 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:32,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@59f3b3a5{ln=LogicalNode@6ea53f86identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@514c57d1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=9806, starting_window=5b28089000000001, ending_window=5b28089000000172, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:32,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6ea53f86identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@514c57d1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=9806, starting_window=5b28089000000001, ending_window=5b28089000000172, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:39,196 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:41,238 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@29a49f7{ln=LogicalNode@8ac6518identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7755fd4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=10265, starting_window=5b28089000000001, ending_window=5b28089000000183, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:41,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@8ac6518identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7755fd4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=10265, starting_window=5b28089000000001, ending_window=5b28089000000183, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:48,236 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:50,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@4c069133{ln=LogicalNode@b98de3aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5f08894c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=10778, starting_window=5b28089000000001, ending_window=5b28089000000196, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:50,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@b98de3aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5f08894c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=10778, starting_window=5b28089000000001, ending_window=5b28089000000196, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:34:57,486 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:59,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5b9dfb4f{ln=LogicalNode@3072c181identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2a042952{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=11264, starting_window=5b28089000000001, ending_window=5b280890000001a8, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:34:59,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3072c181identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2a042952{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=11264, starting_window=5b28089000000001, ending_window=5b280890000001a8, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:35:06,458 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:08,735 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@480551b5{ln=LogicalNode@5a13f81identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@31a28946{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=11750, starting_window=5b28089000000001, ending_window=5b280890000001ba, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:35:08,736 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5a13f81identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@31a28946{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=11750, starting_window=5b28089000000001, ending_window=5b280890000001ba, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:35:15,624 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:17,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@31d2ebf9{ln=LogicalNode@178a4a8bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@41d02dc7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=12236, starting_window=5b28089000000001, ending_window=5b280890000001cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:35:17,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@178a4a8bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@41d02dc7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=12236, starting_window=5b28089000000001, ending_window=5b280890000001cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:35:24,834 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:27,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@75602936{ln=LogicalNode@79b15d2cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5351e70a{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=12749, starting_window=5b28089000000001, ending_window=5b280890000001df, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:35:27,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@79b15d2cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5351e70a{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=12749, starting_window=5b28089000000001,
ending_window=5b280890000001df, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:35:33,830 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:36,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5f1afcc0{ln=LogicalNode@1a013bbaidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6e5d721{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=13235, starting_window=5b28089000000001, ending_window=5b280890000001f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:35:36,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1a013bbaidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6e5d721{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=13235, starting_window=5b28089000000001, ending_window=5b280890000001f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:35:43,002 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:45,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@6edc9855{ln=LogicalNode@1d77f09fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@47a7ad3e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=13721, starting_window=5b28089000000001, ending_window=5b28089000000203, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:35:45,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1d77f09fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@47a7ad3e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=13721, starting_window=5b28089000000001, 
ending_window=5b28089000000203, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:35:52,114 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:54,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1503e30d{ln=LogicalNode@64cd6fcidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@18ee50e5{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=14207, starting_window=5b28089000000001, ending_window=5b28089000000215, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:35:54,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@64cd6fcidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@18ee50e5{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=14207, starting_window=5b28089000000001, ending_window=5b28089000000215, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:01,298 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:03,739 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@cd8e850{ln=LogicalNode@2e09125cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38b5ed6c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=14720, starting_window=5b28089000000001, ending_window=5b28089000000228, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:03,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2e09125cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38b5ed6c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=14720, starting_window=5b28089000000001, 
ending_window=5b28089000000228, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:08,256 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:370) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:546) 2018-06-18 19:36:10,353 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:12,739 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@2418871b{ln=LogicalNode@44034e52identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3340eb3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=15206, starting_window=5b28089000000001, ending_window=5b2808900000023a, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:12,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@44034e52identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3340eb3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=15206, starting_window=5b28089000000001, ending_window=5b2808900000023a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:19,441 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:21,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@f52a4ea{ln=LogicalNode@69afccaeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76932924{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=15692, starting_window=5b28089000000001, ending_window=5b2808900000024c, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:21,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@69afccaeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76932924{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=15692, starting_window=5b28089000000001, 
ending_window=5b2808900000024c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:28,672 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:30,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@96fbedf{ln=LogicalNode@57c0007eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@55b8d655{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=16178, starting_window=5b28089000000001, ending_window=5b2808900000025e, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:30,746 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@57c0007eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@55b8d655{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=16178, starting_window=5b28089000000001, ending_window=5b2808900000025e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:37,789 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:40,239 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@4d2f8b98{ln=LogicalNode@5af2e42fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@17d1aea6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=16691, starting_window=5b28089000000001, ending_window=5b28089000000271, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:40,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5af2e42fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@17d1aea6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=16691, starting_window=5b28089000000001, 
ending_window=5b28089000000271, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:46,851 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:49,237 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@2e9f20e8{ln=LogicalNode@790b16b5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3294e001{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=17177, starting_window=5b28089000000001, ending_window=5b28089000000283, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:49,238 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@790b16b5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3294e001{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=17177, starting_window=5b28089000000001, ending_window=5b28089000000283, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:36:55,993 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:36:58,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@48c92147{ln=LogicalNode@55777edidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7270c88f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=17663, starting_window=5b28089000000001, ending_window=5b28089000000295, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:36:58,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@55777edidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7270c88f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=17686, starting_window=5b28089000000001, 
ending_window=5b28089000000296, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:05,042 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:07,235 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@52189fcf{ln=LogicalNode@73879587identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6a457202{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=18149, starting_window=5b28089000000001, ending_window=5b280890000002a7, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:07,236 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@73879587identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6a457202{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=18149, starting_window=5b28089000000001, ending_window=5b280890000002a7, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:14,236 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:16,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@72c541a9{ln=LogicalNode@4c13a803identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5a7e9074{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=18662, starting_window=5b28089000000001, ending_window=5b280890000002ba, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:16,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4c13a803identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5a7e9074{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=18662, starting_window=5b28089000000001,
ending_window=5b280890000002ba, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:23,317 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:25,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5a63b0a9{ln=LogicalNode@15073dcfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@27d4f1ad{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=19148, starting_window=5b28089000000001, ending_window=5b280890000002cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:25,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@15073dcfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@27d4f1ad{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=19148, starting_window=5b28089000000001, ending_window=5b280890000002cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:32,432 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:34,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@4d17fecf{ln=LogicalNode@1c1504deidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@44577a56{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=19634, starting_window=5b28089000000001, ending_window=5b280890000002de, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:34,742 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1c1504deidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@44577a56{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=19634, starting_window=5b28089000000001,
ending_window=5b280890000002de, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:41,586 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:43,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5bb4090d{ln=LogicalNode@7771bef1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d791fec{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=20120, starting_window=5b28089000000001, ending_window=5b280890000002f0, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:43,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7771bef1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d791fec{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=20120, starting_window=5b28089000000001, ending_window=5b280890000002f0, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:50,735 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:53,239 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5e2c756{ln=LogicalNode@31e510c0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d022437{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=20633, starting_window=5b28089000000001, ending_window=5b28089000000303, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:37:53,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@31e510c0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d022437{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=20633, starting_window=5b28089000000001,
ending_window=5b28089000000303, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:37:59,887 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:02,235 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@4a629b90{ln=LogicalNode@309e321bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4eb57a65{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=21119, starting_window=5b28089000000001, ending_window=5b28089000000315, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:02,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@309e321bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4eb57a65{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=21119, starting_window=5b28089000000001, ending_window=5b28089000000315, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:08,839 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:11,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@23d58b29{ln=LogicalNode@3a2ffc83identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6355e57b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=21605, starting_window=5b28089000000001, ending_window=5b28089000000327, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:11,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3a2ffc83identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6355e57b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=21605, starting_window=5b28089000000001,
ending_window=5b28089000000327, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:17,984 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:20,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@644f4e4d{ln=LogicalNode@2500a635identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@903747d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=22091, starting_window=5b28089000000001, ending_window=5b28089000000339, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:20,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2500a635identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@903747d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=22091, starting_window=5b28089000000001, ending_window=5b28089000000339, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:27,181 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:29,237 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@2a03ed7f{ln=LogicalNode@6f0dd758identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@517d1eb2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=22577, starting_window=5b28089000000001, ending_window=5b2808900000034b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:29,238 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6f0dd758identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@517d1eb2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=22577, starting_window=5b28089000000001,
ending_window=5b2808900000034b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:36,274 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:38,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@712ec1d2{ln=LogicalNode@513ff0ccidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@dddcfba{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=23090, starting_window=5b28089000000001, ending_window=5b2808900000035e, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:38,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@513ff0ccidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@dddcfba{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=23090, starting_window=5b28089000000001, ending_window=5b2808900000035e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:45,377 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:47,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@64e28781{ln=LogicalNode@65543b52identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@133a29f0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=23576, starting_window=5b28089000000001, ending_window=5b28089000000370, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:47,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@65543b52identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@133a29f0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=23576, starting_window=5b28089000000001,
ending_window=5b28089000000370, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:38:54,451 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:38:56,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6273567b{ln=LogicalNode@4c646369identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@78d6b34e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=24062, starting_window=5b28089000000001, ending_window=5b28089000000382, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:38:56,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4c646369identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@78d6b34e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=24062, starting_window=5b28089000000001, ending_window=5b28089000000382, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:39:03,635 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:05,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3c4a3beb{ln=LogicalNode@7880099bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@102c9310{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=24548, starting_window=5b28089000000001, ending_window=5b28089000000394, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:39:05,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7880099bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@102c9310{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=24548, starting_window=5b28089000000001,
ending_window=5b28089000000394, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:39:12,682 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:14,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@38587e{ln=LogicalNode@6f281c3bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@438cbc53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=25034, starting_window=5b28089000000001, ending_window=5b280890000003a6, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:39:14,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6f281c3bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@438cbc53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=25034, starting_window=5b28089000000001, ending_window=5b280890000003a6, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:39:21,820 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:24,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@250f756{ln=LogicalNode@7a64ea56identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@53bb2de{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=25547, starting_window=5b28089000000001, ending_window=5b280890000003b9, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:39:24,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7a64ea56identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@53bb2de{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=25570, starting_window=5b28089000000001,
ending_window=5b280890000003ba, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:39:30,984 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:39:33,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@38dda6cc{ln=LogicalNode@17bbbc28identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76c7fa38{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=26033, starting_window=5b28089000000001, ending_window=5b280890000003cb, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:39:33,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@17bbbc28identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76c7fa38{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=26033, starting_window=5b28089000000001, ending_window=5b280890000003cb, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:39:39,952 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:39:42,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@12d413ce{ln=LogicalNode@4f3f6d98identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6ba7a653{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=26519, starting_window=5b28089000000001, ending_window=5b280890000003dd, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:39:42,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4f3f6d98identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6ba7a653{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=26519, starting_window=5b28089000000001, 
ending_window=5b280890000003dd, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:39:49,169 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:39:51,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1a814975{ln=LogicalNode@5011ecf1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@67405311{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=27005, starting_window=5b28089000000001, ending_window=5b280890000003ef, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:39:51,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5011ecf1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@67405311{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=27005, starting_window=5b28089000000001, ending_window=5b280890000003ef, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:39:58,235 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:00,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@6269df5e{ln=LogicalNode@568e3d26identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@37645f4e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=27518, starting_window=5b28089000000001, ending_window=5b28089000000402, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:00,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@568e3d26identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@37645f4e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=27518, starting_window=5b28089000000001, 
ending_window=5b28089000000402, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:07,511 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:09,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@46541c63{ln=LogicalNode@197d063cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@be2052b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28004, starting_window=5b28089000000001, ending_window=5b28089000000414, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:09,739 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@197d063cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@be2052b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28004, starting_window=5b28089000000001, ending_window=5b28089000000414, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:16,498 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:18,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@3542b962{ln=LogicalNode@1202c5e1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7f4dd28e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28490, starting_window=5b28089000000001, ending_window=5b28089000000426, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:18,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1202c5e1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7f4dd28e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28490, starting_window=5b28089000000001, 
ending_window=5b28089000000426, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:25,615 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:27,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@745364c6{ln=LogicalNode@5eec096identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@681ea5a7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28976, starting_window=5b28089000000001, ending_window=5b28089000000438, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:27,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5eec096identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@681ea5a7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=28976, starting_window=5b28089000000001, ending_window=5b28089000000438, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:34,785 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:37,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@25c7f0e0{ln=LogicalNode@216e1474identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34c55538{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=29489, starting_window=5b28089000000001, ending_window=5b2808900000044b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:37,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@216e1474identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34c55538{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=29489, starting_window=5b28089000000001, 
ending_window=5b2808900000044b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:43,850 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:46,238 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@4ef421cb{ln=LogicalNode@3dedb2f3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7fb929db{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=29975, starting_window=5b28089000000001, ending_window=5b2808900000045d, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:46,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3dedb2f3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7fb929db{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=29975, starting_window=5b28089000000001, ending_window=5b2808900000045d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:40:53,163 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:55,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@27b7155e{ln=LogicalNode@76ab7868identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@cec915{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=30461, starting_window=5b28089000000001, ending_window=5b2808900000046f, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:40:55,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@76ab7868identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@cec915{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=30461, starting_window=5b28089000000001, 
ending_window=5b2808900000046f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:41:02,055 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:41:04,238 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@19bef50f{ln=LogicalNode@4914b6e1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4db14b9f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=30947, starting_window=5b28089000000001, ending_window=5b28089000000481, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:04,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4914b6e1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4db14b9f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=30970, starting_window=5b28089000000001, ending_window=5b28089000000482, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:11,367 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:13,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@56164778{ln=LogicalNode@7b531deaidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@47015e93{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=31460, starting_window=5b28089000000001, ending_window=5b28089000000494, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:13,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7b531deaidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@47015e93{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=31460, starting_window=5b28089000000001, ending_window=5b28089000000494, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:20,410 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:22,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@152b8902{ln=LogicalNode@796496f8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@385843b3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=31946, starting_window=5b28089000000001, ending_window=5b280890000004a6, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:22,738 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@796496f8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@385843b3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=31946, starting_window=5b28089000000001, ending_window=5b280890000004a6, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:29,556 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:31,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1dad232b{ln=LogicalNode@7ef2d139identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e568ff0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=32432, starting_window=5b28089000000001, ending_window=5b280890000004b8, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:31,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7ef2d139identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e568ff0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=32432, starting_window=5b28089000000001, ending_window=5b280890000004b8, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:38,577 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:40,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@37d8347c{ln=LogicalNode@6ad6a729identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4309c0b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=32918, starting_window=5b28089000000001, ending_window=5b280890000004ca, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:40,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6ad6a729identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4309c0b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=32918, starting_window=5b28089000000001, ending_window=5b280890000004ca, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:47,883 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:50,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1e166523{ln=LogicalNode@760504c1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@224571a5{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=33454, starting_window=5b28089000000001, ending_window=5b280890000004de, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:50,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@760504c1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@224571a5{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=33454, starting_window=5b28089000000001, ending_window=5b280890000004de, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:41:56,879 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:59,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@8a5a9{ln=LogicalNode@5f38a56eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@bab2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=33917, starting_window=5b28089000000001, ending_window=5b280890000004ef, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:41:59,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5f38a56eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@bab2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=33917, starting_window=5b28089000000001, ending_window=5b280890000004ef, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:05,984 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:08,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@72aafcd7{ln=LogicalNode@2cd7dd4fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@117ab7d8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=34403, starting_window=5b28089000000001, ending_window=5b28089000000501, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:08,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2cd7dd4fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@117ab7d8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=34403, starting_window=5b28089000000001, ending_window=5b28089000000501, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:14,971 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:17,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@69b71091{ln=LogicalNode@73bedb1cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@32fa88a3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=34889, starting_window=5b28089000000001, ending_window=5b28089000000513, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:17,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@73bedb1cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@32fa88a3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=34889, starting_window=5b28089000000001, ending_window=5b28089000000513, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:24,124 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:26,235 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@56d7ddf8{ln=LogicalNode@44784fc8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2c2f6856{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=35375, starting_window=5b28089000000001, ending_window=5b28089000000525, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:26,236 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@44784fc8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2c2f6856{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=35375, starting_window=5b28089000000001, ending_window=5b28089000000525, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:33,374 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:35,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@159afc4e{ln=LogicalNode@1079a3daidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@765990d1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=35888, starting_window=5b28089000000001, ending_window=5b28089000000538, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:35,742 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1079a3daidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@765990d1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=35888, starting_window=5b28089000000001, ending_window=5b28089000000538, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:42,381 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:44,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@78a99430{ln=LogicalNode@7ea992e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3bab9a53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=36374, starting_window=5b28089000000001, ending_window=5b2808900000054a, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:44,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7ea992e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3bab9a53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=36374, starting_window=5b28089000000001, ending_window=5b2808900000054a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:42:51,462 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:53,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3c1369f5{ln=LogicalNode@61296adeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@305aaf26{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=36860, starting_window=5b28089000000001, ending_window=5b2808900000055c, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:42:53,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@61296adeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@305aaf26{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=36860, starting_window=5b28089000000001, ending_window=5b2808900000055c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:43:00,606 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:02,737 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@55156adf{ln=LogicalNode@6719cc1didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5267eea3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=37346, starting_window=5b28089000000001, ending_window=5b2808900000056e, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:43:02,738 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6719cc1didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5267eea3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=37346, starting_window=5b28089000000001, ending_window=5b2808900000056e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:43:09,674 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:11,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@40624dca{ln=LogicalNode@6ab7b082identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@15602c5e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=37832, starting_window=5b28089000000001, ending_window=5b28089000000580, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:43:11,739 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6ab7b082identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@15602c5e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=37832, starting_window=5b28089000000001, ending_window=5b28089000000580, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:43:19,645 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:21,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@148a6c99{ln=LogicalNode@3dda3331identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3ae6d515{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=38395, starting_window=5b28089000000001, ending_window=5b28089000000595, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:43:21,738 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3dda3331identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3ae6d515{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=38395, starting_window=5b28089000000001, ending_window=5b28089000000595, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:43:28,098 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:30,238 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6f04255d{ln=LogicalNode@38e0ab9fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63c68b53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=38831, starting_window=5b28089000000001, ending_window=5b280890000005a5, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:43:30,239 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@38e0ab9fidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63c68b53{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=38831, starting_window=5b28089000000001, ending_window=5b280890000005a5, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:43:37,038 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:39,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@7f49e6be{ln=LogicalNode@13e611ddidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@c47e4fa{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=39317, starting_window=5b28089000000001, ending_window=5b280890000005b7, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:43:39,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@13e611ddidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@c47e4fa{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=39317, starting_window=5b28089000000001, 
ending_window=5b280890000005b7, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:43:46,324 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:48,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5d06e031{ln=LogicalNode@6087045didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@33a717b7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=39830, starting_window=5b28089000000001, ending_window=5b280890000005ca, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:43:48,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6087045didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@33a717b7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=39830, starting_window=5b28089000000001, ending_window=5b280890000005ca, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:43:55,371 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:57,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@50cb62e5{ln=LogicalNode@7300f9d8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ebac3ee{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=40316, starting_window=5b28089000000001, ending_window=5b280890000005dc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:43:57,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7300f9d8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ebac3ee{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=40316, starting_window=5b28089000000001, 
ending_window=5b280890000005dc, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:04,511 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:06,739 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@196d88e1{ln=LogicalNode@4190d27didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34612b2f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=40802, starting_window=5b28089000000001, ending_window=5b280890000005ee, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:06,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4190d27didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34612b2f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=40802, starting_window=5b28089000000001, ending_window=5b280890000005ee, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:13,535 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:15,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@4a475ce0{ln=LogicalNode@3e7db249identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d3fab6f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=41288, starting_window=5b28089000000001, ending_window=5b28089000000600, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:15,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3e7db249identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d3fab6f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=41288, starting_window=5b28089000000001, 
ending_window=5b28089000000600, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:22,683 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:24,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@4ba21fe7{ln=LogicalNode@ce63fd4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3868a4f8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=41774, starting_window=5b28089000000001, ending_window=5b28089000000612, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:24,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@ce63fd4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3868a4f8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=41774, starting_window=5b28089000000001, ending_window=5b28089000000612, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:32,016 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:34,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@7cb6d2f4{ln=LogicalNode@2f8ac8d1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@41f091d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=42287, starting_window=5b28089000000001, ending_window=5b28089000000625, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:34,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2f8ac8d1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@41f091d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=42287, starting_window=5b28089000000001, 
ending_window=5b28089000000625, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:40,866 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:43,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@740ca7e0{ln=LogicalNode@49a92e2aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa8559{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=42773, starting_window=5b28089000000001, ending_window=5b28089000000637, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:43,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@49a92e2aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa8559{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=42773, starting_window=5b28089000000001, ending_window=5b28089000000637, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:49,982 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:52,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@3426499{ln=LogicalNode@7a77e860identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@75463322{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=43259, starting_window=5b28089000000001, ending_window=5b28089000000649, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:44:52,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7a77e860identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@75463322{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=43259, starting_window=5b28089000000001, 
ending_window=5b28089000000649, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:44:59,166 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:01,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6b556369{ln=LogicalNode@6b0dd857identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c1657d3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=43745, starting_window=5b28089000000001, ending_window=5b2808900000065b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:01,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6b0dd857identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c1657d3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=43745, starting_window=5b28089000000001, ending_window=5b2808900000065b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:08,327 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:10,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@7b7b1f03{ln=LogicalNode@3e0acce4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21deefac{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=44258, starting_window=5b28089000000001, ending_window=5b2808900000066e, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:10,746 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3e0acce4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21deefac{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=44258, starting_window=5b28089000000001, 
ending_window=5b2808900000066e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:17,446 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:19,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@7e04ae67{ln=LogicalNode@7bce0e90identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@22a93207{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=44744, starting_window=5b28089000000001, ending_window=5b28089000000680, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:19,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7bce0e90identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@22a93207{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=44744, starting_window=5b28089000000001, ending_window=5b28089000000680, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:26,488 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:28,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@58ee85ed{ln=LogicalNode@30eb1ff8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5432eb22{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=45230, starting_window=5b28089000000001, ending_window=5b28089000000692, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:28,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@30eb1ff8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5432eb22{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=45230, starting_window=5b28089000000001, 
ending_window=5b28089000000692, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:35,713 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:37,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6bc7e814{ln=LogicalNode@6076c172identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@d9317e8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=45716, starting_window=5b28089000000001, ending_window=5b280890000006a4, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:37,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6076c172identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@d9317e8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=45716, starting_window=5b28089000000001, ending_window=5b280890000006a4, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:44,912 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:47,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@4c1bf17c{ln=LogicalNode@24e733d8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2bca8c7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=46229, starting_window=5b28089000000001, ending_window=5b280890000006b7, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:47,236 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@24e733d8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2bca8c7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=46229, starting_window=5b28089000000001, 
ending_window=5b280890000006b7, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:45:53,860 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:56,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@70b4e818{ln=LogicalNode@269fca2didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4d24fc5e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=46715, starting_window=5b28089000000001, ending_window=5b280890000006c9, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:45:56,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@269fca2didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4d24fc5e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=46715, starting_window=5b28089000000001, ending_window=5b280890000006c9, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:02,948 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:05,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@359c4f93{ln=LogicalNode@2878d30bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@54c5610f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=47201, starting_window=5b28089000000001, ending_window=5b280890000006db, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:05,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2878d30bidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@54c5610f{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=47201, starting_window=5b28089000000001, 
ending_window=5b280890000006db, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:12,029 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:14,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@7f5b0a34{ln=LogicalNode@274d7950identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2eca34f8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=47687, starting_window=5b28089000000001, ending_window=5b280890000006ed, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:14,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@274d7950identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2eca34f8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=47710, starting_window=5b28089000000001, ending_window=5b280890000006ee, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:21,331 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:23,742 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@70fa4afd{ln=LogicalNode@1ef42f62identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2609714e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=48200, starting_window=5b28089000000001, ending_window=5b28089000000700, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:23,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1ef42f62identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2609714e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=48200, starting_window=5b28089000000001, 
ending_window=5b28089000000700, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:30,404 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:32,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@39d99c22{ln=LogicalNode@7512f5b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@97cb544{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=48686, starting_window=5b28089000000001, ending_window=5b28089000000712, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:32,742 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7512f5b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@97cb544{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=48709, starting_window=5b28089000000001, ending_window=5b28089000000713, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:39,643 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:41,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@571de44d{ln=LogicalNode@68ad5ef3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4ac194fd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=49172, starting_window=5b28089000000001, ending_window=5b28089000000724, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:41,742 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@68ad5ef3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4ac194fd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=49172, starting_window=5b28089000000001, 
ending_window=5b28089000000724, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:48,516 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:50,739 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1de0e748{ln=LogicalNode@7f3d2e31identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6c77b984{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=49681, starting_window=5b28089000000001, ending_window=5b28089000000737, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:50,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7f3d2e31identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6c77b984{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=49681, starting_window=5b28089000000001, ending_window=5b28089000000737, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:46:57,532 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:59,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@47ba9217{ln=LogicalNode@45e216d5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6ecab131{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=50144, starting_window=5b28089000000001, ending_window=5b28089000000748, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:46:59,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@45e216d5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6ecab131{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=50144, starting_window=5b28089000000001, 
ending_window=5b28089000000748, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:47:06,759 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:47:09,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5f227386{ln=LogicalNode@49f882f5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@403595b0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=50657, starting_window=5b28089000000001, ending_window=5b2808900000075b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:09,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@49f882f5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@403595b0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=50657, starting_window=5b28089000000001, ending_window=5b2808900000075b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:47:15,709 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:17,735 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5ac18bf7{ln=LogicalNode@f406ac4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4e58da50{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=51116, starting_window=5b28089000000001, ending_window=5b2808900000076c, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:17,736 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@f406ac4identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4e58da50{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=51116, starting_window=5b28089000000001, ending_window=5b2808900000076c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:47:25,453 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:27,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@433286e9{ln=LogicalNode@4e504b50identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7ee96ba6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=51656, starting_window=5b28089000000001, ending_window=5b28089000000780, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:27,739 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4e504b50identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7ee96ba6{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=51656, starting_window=5b28089000000001, ending_window=5b28089000000780, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:47:35,330 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:37,742 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1898ff5e{ln=LogicalNode@1402b0c3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@45a3d40d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=52219, starting_window=5b28089000000001, ending_window=5b28089000000795, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:37,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1402b0c3identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@45a3d40d{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=52219, starting_window=5b28089000000001, ending_window=5b28089000000795, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:47:45,218 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:47,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@43d07964{ln=LogicalNode@f96a085identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5b7ec460{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=52709, starting_window=5b28089000000001, ending_window=5b280890000007a7, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:47,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@f96a085identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5b7ec460{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=52732, starting_window=5b28089000000001, ending_window=5b280890000007a8, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:47:54,861 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:57,240 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@50abdc26{ln=LogicalNode@512eea58identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34236f6e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=53249, starting_window=5b28089000000001, ending_window=5b280890000007bb, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:47:57,241 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@512eea58identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@34236f6e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=53249, starting_window=5b28089000000001, ending_window=5b280890000007bb, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:03,425 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:05,735 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@1175b240{ln=LogicalNode@2bddba63identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35fc465c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=53708, starting_window=5b28089000000001, ending_window=5b280890000007cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:05,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2bddba63identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35fc465c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=53708, starting_window=5b28089000000001, ending_window=5b280890000007cc, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:12,657 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:14,735 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@251909d2{ln=LogicalNode@3c35b03identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2cad17dc{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=54194, starting_window=5b28089000000001, ending_window=5b280890000007de, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:14,736 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3c35b03identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2cad17dc{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=54194, starting_window=5b28089000000001, ending_window=5b280890000007de, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:21,542 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:23,737 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3f181e89{ln=LogicalNode@7950a5abidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@474b8bc8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=54680, starting_window=5b28089000000001, ending_window=5b280890000007f0, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:23,738 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7950a5abidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@474b8bc8{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=54680, starting_window=5b28089000000001, ending_window=5b280890000007f0, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:30,668 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:32,742 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@14ea74ca{ln=LogicalNode@d682e17identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@68f9d5d9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=55166, starting_window=5b28089000000001, ending_window=5b28089000000802, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:32,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@d682e17identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@68f9d5d9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=55166, starting_window=5b28089000000001, ending_window=5b28089000000802, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:39,800 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:42,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@72fb8565{ln=LogicalNode@efa1b6cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63c7b97c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=55679, starting_window=5b28089000000001, ending_window=5b28089000000815, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:42,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@efa1b6cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63c7b97c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=55679, starting_window=5b28089000000001, ending_window=5b28089000000815, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:48,820 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:51,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@49f1625d{ln=LogicalNode@525f5e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d85df7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=56165, starting_window=5b28089000000001, ending_window=5b28089000000827, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:48:51,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@525f5e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d85df7{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=56165, starting_window=5b28089000000001, ending_window=5b28089000000827, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:48:58,916 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:01,239 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@2bdb888e{ln=LogicalNode@75a4d7e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4322397c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=56705, starting_window=5b28089000000001, ending_window=5b2808900000083b, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:49:01,240 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@75a4d7e2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4322397c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=56705, starting_window=5b28089000000001, ending_window=5b2808900000083b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:49:08,387 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:10,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3705b081{ln=LogicalNode@1da5d966identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2729f670{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=57241, starting_window=5b28089000000001, ending_window=5b2808900000084f, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:49:10,747 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1da5d966identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2729f670{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=57241, starting_window=5b28089000000001, ending_window=5b2808900000084f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:49:17,603 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:19,743 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6acdbda4{ln=LogicalNode@39857cd8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6e06de69{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=57704, starting_window=5b28089000000001, ending_window=5b28089000000860, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:49:19,744 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@39857cd8identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6e06de69{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=57704, starting_window=5b28089000000001, ending_window=5b28089000000860, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:49:26,465 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:28,737 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@231e52d1{ln=LogicalNode@1043c13didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ac96d77{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=58190, starting_window=5b28089000000001, ending_window=5b28089000000872, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:51)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
	at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
	at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
	at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
	at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
	at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
	at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
	at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:49:28,738 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1043c13didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ac96d77{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=58190, starting_window=5b28089000000001, ending_window=5b28089000000872, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:49:35,482 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:37,740 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@17e46e0c{ln=LogicalNode@515b394identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@67ec4d39{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=58676, starting_window=5b28089000000001, ending_window=5b28089000000884, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:49:37,741 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@515b394identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@67ec4d39{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=58676, starting_window=5b28089000000001, ending_window=5b28089000000884, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:49:44,779 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:47,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@54ccf13d{ln=LogicalNode@5a0888b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@235197e1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=59189, starting_window=5b28089000000001, ending_window=5b28089000000897, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:49:47,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5a0888b0identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@235197e1{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=59189, starting_window=5b28089000000001, 
ending_window=5b28089000000897, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:49:53,640 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:55,742 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6ea819f3{ln=LogicalNode@46098ea1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@461547bd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=59671, starting_window=5b28089000000001, ending_window=5b280890000008a9, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:49:55,743 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@46098ea1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@461547bd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=59671, starting_window=5b28089000000001, ending_window=5b280890000008a9, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:03,074 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:05,239 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@75ffda44{ln=LogicalNode@6bac8143identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@79d283c9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=60161, starting_window=5b28089000000001, ending_window=5b280890000008bb, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:05,239 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6bac8143identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@79d283c9{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=60161, starting_window=5b28089000000001, 
ending_window=5b280890000008bb, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:12,017 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:14,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@45e57a44{ln=LogicalNode@4f9398dfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@28385f84{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=60647, starting_window=5b28089000000001, ending_window=5b280890000008cd, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:14,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4f9398dfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@28385f84{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=60647, starting_window=5b28089000000001, ending_window=5b280890000008cd, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:21,079 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:23,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@44b55693{ln=LogicalNode@4b9c71dfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d599901{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=61133, starting_window=5b28089000000001, ending_window=5b280890000008df, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:23,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4b9c71dfidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d599901{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=61133, starting_window=5b28089000000001, 
ending_window=5b280890000008df, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:30,118 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:32,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@254515b2{ln=LogicalNode@764334a5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@444a0a35{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=61619, starting_window=5b28089000000001, ending_window=5b280890000008f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:32,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@764334a5identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@444a0a35{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=61619, starting_window=5b28089000000001, ending_window=5b280890000008f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:39,222 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:41,241 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@182ce530{ln=LogicalNode@30966d63identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3b5f0e20{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=62105, starting_window=5b28089000000001, ending_window=5b28089000000903, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:41,242 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@30966d63identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3b5f0e20{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=62105, starting_window=5b28089000000001, 
ending_window=5b28089000000903, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:48,262 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:50,737 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3df30646{ln=LogicalNode@52faeb41identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@8f649c0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=62618, starting_window=5b28089000000001, ending_window=5b28089000000916, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:50,740 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@52faeb41identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@8f649c0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=62618, starting_window=5b28089000000001, ending_window=5b28089000000916, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:50:57,438 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:59,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@2d00814b{ln=LogicalNode@2e7e1cfdidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@ee5fa27{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=63104, starting_window=5b28089000000001, ending_window=5b28089000000928, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:50:59,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2e7e1cfdidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@ee5fa27{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=63104, starting_window=5b28089000000001, 
ending_window=5b28089000000928, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:51:06,725 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:51:08,741 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@b9aee42{ln=LogicalNode@ebfac93identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7a2a6c73{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=63590, starting_window=5b28089000000001, ending_window=5b2808900000093a, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception. 
java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:51) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187) at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75) at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387) at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192) at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157) at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175) at java.lang.Thread.run(Thread.java:748) 2018-06-18 19:51:08,742 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@ebfac93identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7a2a6c73{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=63590, starting_window=5b28089000000001, ending_window=5b2808900000093a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1] 2018-06-18 19:51:15,645 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:51:17,745 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting 
Subscriber@271c6ed6{ln=LogicalNode@795b073cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2a965ec2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=64076, starting_window=5b28089000000001, ending_window=5b2808900000094c, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:51:17,746 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@795b073cidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2a965ec2{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=64076, starting_window=5b28089000000001, ending_window=5b2808900000094c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:51:24,736 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:27,237 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@256f0eab{ln=LogicalNode@56c7e5a1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4a2b2ba3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=64589, starting_window=5b28089000000001, ending_window=5b2808900000095f, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:51:27,239 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@56c7e5a1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4a2b2ba3{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=64589, starting_window=5b28089000000001, ending_window=5b2808900000095f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:51:33,898 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:36,244 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@12695b71{ln=LogicalNode@332acce7identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4597c558{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=65075, starting_window=5b28089000000001, ending_window=5b28089000000971, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:51:36,245 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@332acce7identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4597c558{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=65075, starting_window=5b28089000000001, ending_window=5b28089000000971, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:51:43,163 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:45,243 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@6e6d967f{ln=LogicalNode@7ee8cfeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76b0693c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=65561, starting_window=5b28089000000001, ending_window=5b28089000000983, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:51:45,244 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7ee8cfeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@76b0693c{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=65561, starting_window=5b28089000000001, ending_window=5b28089000000983, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:51:52,242 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:54,738 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@5f84fda1{ln=LogicalNode@3a2c4f42identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1422bc27{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=66074, starting_window=5b28089000000001, ending_window=5b28089000000996, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:51:54,739 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3a2c4f42identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1422bc27{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=66074, starting_window=5b28089000000001, ending_window=5b28089000000996, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:01,221 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:03,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@3870e816{ln=LogicalNode@4b0a1a9aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3cc476f4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=66560, starting_window=5b28089000000001, ending_window=5b280890000009a8, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:03,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4b0a1a9aidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3cc476f4{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=66560, starting_window=5b28089000000001, ending_window=5b280890000009a8, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:10,849 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:13,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@331b9faf{ln=LogicalNode@6b9580d1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@18ddb9b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=67096, starting_window=5b28089000000001, ending_window=5b280890000009bc, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:13,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6b9580d1identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@18ddb9b{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=67096, starting_window=5b28089000000001, ending_window=5b280890000009bc, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:19,777 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:22,237 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@541ae5ed{ln=LogicalNode@4f5feb3eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@40c87fa0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=67582, starting_window=5b28089000000001, ending_window=5b280890000009ce, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:22,238 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4f5feb3eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@40c87fa0{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=67582, starting_window=5b28089000000001, ending_window=5b280890000009ce, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:29,156 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:31,242 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@20d6e39d{ln=LogicalNode@67827c5eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@40cf6459{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=68045, starting_window=5b28089000000001, ending_window=5b280890000009df, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:31,243 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@67827c5eidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@40cf6459{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=68045, starting_window=5b28089000000001, ending_window=5b280890000009df, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:37,726 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:40,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@64b4d8ce{ln=LogicalNode@5402d97didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@579fb961{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=68531, starting_window=5b28089000000001, ending_window=5b280890000009f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:40,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5402d97didentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@579fb961{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=68531, starting_window=5b28089000000001, ending_window=5b280890000009f1, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:47,273 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:49,744 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@435f126b{ln=LogicalNode@37cbe6d2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d4a133e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=69044, starting_window=5b28089000000001, ending_window=5b28089000000a04, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:49,745 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@37cbe6d2identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d4a133e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=69044, starting_window=5b28089000000001, ending_window=5b28089000000a04, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:52:56,024 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:58,236 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@75fb3a4d{ln=LogicalNode@3770ccbeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6133123e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=69503, starting_window=5b28089000000001, ending_window=5b28089000000a15, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:52:58,237 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3770ccbeidentifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6133123e{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=69503, starting_window=5b28089000000001, ending_window=5b28089000000a15, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:53:05,330 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:53:07,736 ERROR com.datatorrent.netlet.WriteOnlyClient: Disconnecting Subscriber@c874c71{ln=LogicalNode@6412c645identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2309c9cd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=70016, starting_window=5b28089000000001, ending_window=5b28089000000a28, refCount=2, uniqueIdentifier=0, next=null, future=null}}}} because of an exception.
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
    at sun.nio.ch.IOUtil.write(IOUtil.java:51)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471)
    at com.datatorrent.netlet.WriteOnlyClient.channelWrite(WriteOnlyClient.java:187)
    at com.datatorrent.netlet.WriteOnlyLengthPrependerClient.write(WriteOnlyLengthPrependerClient.java:75)
    at com.datatorrent.netlet.DefaultEventLoop.handleSelectedKey(DefaultEventLoop.java:387)
    at com.datatorrent.netlet.OptimizedEventLoop$SelectedSelectionKeySet.forEach(OptimizedEventLoop.java:59)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:192)
    at com.datatorrent.netlet.OptimizedEventLoop.runEventLoop(OptimizedEventLoop.java:157)
    at com.datatorrent.netlet.DefaultEventLoop.run(DefaultEventLoop.java:175)
    at java.lang.Thread.run(Thread.java:748)
2018-06-18 19:53:07,737 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6412c645identifier=tcp://laptop-name:41209/1.out.1, upstream=1.out.1, group=aggregate/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2309c9cd{da=com.datatorrent.bufferserver.internal.DataList$Block@3b2f015b{identifier=1.out.1, data=67108864, readingOffset=0, writingOffset=70016, starting_window=5b28089000000001, ending_window=5b28089000000a28, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@289952b2[identifier=1.out.1]
2018-06-18 19:53:14,354 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41209/1.out.1, windowId=ffffffffffffffff, type=aggregate/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000135 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22165
Log Contents:
2018-06-18 19:51:40,620 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:51:41,906 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000135/tmp as the basepath for spooling.
2018-06-18 19:51:41,910 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45501
2018-06-18 19:51:43,006 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:51:43,142 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:51:43,210 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45501/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:51:43,226 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000135/tmp/chkp6310066077406516651 as the basepath for checkpointing.
2018-06-18 19:51:43,236 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:51:43,365 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1e664cfc for node 2
2018-06-18 19:51:43,387 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:51:43,388 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746420_5596
2018-06-18 19:51:43,390 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:51:45,171 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:51:45,174 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:45,178 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@355a1ff{identifier=tcp://laptop-name:45501/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@31040453{da=com.datatorrent.bufferserver.internal.DataList$Block@2357406f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@774398db[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000102 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22165
Log Contents:
2018-06-18 19:46:37,099 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:46:38,413 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000102/tmp as the basepath for spooling. 
2018-06-18 19:46:38,418 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35187
2018-06-18 19:46:39,516 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:46:39,622 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:46:39,697 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000102/tmp/chkp1109989534824260563 as the basepath for checkpointing.
2018-06-18 19:46:39,703 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:46:39,718 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35187/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:46:39,824 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2d19c7f7 for node 2
2018-06-18 19:46:39,835 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:39,835 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745783_4959
2018-06-18 19:46:39,839 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:46:41,639 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:46:41,642 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:46:41,646 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3fe979b3identifier=tcp://laptop-name:35187/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f6a428f{da=com.datatorrent.bufferserver.internal.DataList$Block@295033f8{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@d36eb9c[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000061 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22166
Log Contents:
2018-06-18 19:40:23,212 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:40:24,428 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000061/tmp as the basepath for spooling. 
2018-06-18 19:40:24,431 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34539
2018-06-18 19:40:25,505 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:25,595 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:25,669 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000061/tmp/chkp5886571917813869742 as the basepath for checkpointing.
2018-06-18 19:40:25,679 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:25,795 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1a859e25 for node 2
2018-06-18 19:40:25,813 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:40:25,814 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744999_4175
2018-06-18 19:40:25,817 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:40:25,899 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34539/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:40:27,623 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:27,625 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:27,629 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@76e454a4identifier=tcp://laptop-name:34539/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@485efba7{da=com.datatorrent.bufferserver.internal.DataList$Block@5f7a9420{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@5f13c43d[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000028 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:35:22,404 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:35:23,585 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000028/tmp as the basepath for spooling.
2018-06-18 19:35:23,590 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41533
2018-06-18 19:35:24,677 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:35:24,731 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41533/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:35:24,833 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:35:24,914 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000028/tmp/chkp2800290828431191961 as the basepath for checkpointing.
2018-06-18 19:35:24,922 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:35:25,040 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@a40a209 for node 2
2018-06-18 19:35:25,057 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:35:25,057 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744368_3544
2018-06-18 19:35:25,059 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:35:25,060 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:35:26,852 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:35:26,856 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:26,859 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@30a7e3a3{identifier=tcp://laptop-name:41533/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1b53ddf0{da=com.datatorrent.bufferserver.internal.DataList$Block@3489dce8{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@146f8a8f[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000127 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:50:27,779 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:50:28,934 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000127/tmp as the basepath for spooling. 
2018-06-18 19:50:28,938 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33551
2018-06-18 19:50:30,009 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:50:30,099 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:50:30,170 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000127/tmp/chkp4616984547531325759 as the basepath for checkpointing.
2018-06-18 19:50:30,177 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:50:30,296 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@31857abe for node 2
2018-06-18 19:50:30,398 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33551/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:50:30,425 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:50:30,426 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:50:32,123 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:50:32,126 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:32,130 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2840384d{identifier=tcp://laptop-name:33551/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d3c014a{da=com.datatorrent.bufferserver.internal.DataList$Block@1229d87{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3a6c7198[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000094 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:45:24,115 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:25,308 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000094/tmp as the basepath for spooling. 
2018-06-18 19:45:25,311 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38047
2018-06-18 19:45:26,384 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:45:26,471 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:45:26,544 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000094/tmp/chkp250688633816015908 as the basepath for checkpointing.
2018-06-18 19:45:26,549 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:45:26,671 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5f7f8bb2 for node 2
2018-06-18 19:45:26,675 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:45:26,676 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745628_4804
2018-06-18 19:45:26,678 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:45:26,679 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:45:26,861 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38047/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:45:28,488 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:45:28,491 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:45:28,496 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@22e15cb5identifier=tcp://laptop-name:38047/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6d6d2c2{da=com.datatorrent.bufferserver.internal.DataList$Block@11f9f57b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@57b509ee[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000053 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20916 Log Contents: 2018-06-18 19:39:10,276 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:39:11,438 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000053/tmp as the basepath for spooling. 
2018-06-18 19:39:11,442 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37073
2018-06-18 19:39:12,546 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:12,663 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:12,735 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000053/tmp/chkp7121878711087082283 as the basepath for checkpointing.
2018-06-18 19:39:12,746 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:39:12,863 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@14b8503a for node 2
2018-06-18 19:39:12,975 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:39:12,975 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:39:12,986 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37073/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:14,684 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:39:14,685 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:14,687 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6d6af294identifier=tcp://laptop-name:37073/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5a817f41{da=com.datatorrent.bufferserver.internal.DataList$Block@1e2473c2{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1242, starting_window=5b28089000000001, ending_window=5b2808900000000d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7b86945e[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000020 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:34:09,437 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:34:10,636 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000020/tmp as the basepath for spooling. 
2018-06-18 19:34:10,640 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40743
2018-06-18 19:34:11,743 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:34:11,792 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40743/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:11,882 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:34:11,942 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000020/tmp/chkp8150989938442125325 as the basepath for checkpointing.
2018-06-18 19:34:11,954 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:34:12,069 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@7a42b3cd for node 2
2018-06-18 19:34:12,073 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:34:12,073 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744217_3393
2018-06-18 19:34:12,076 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:34:12,077 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:34:13,912 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:34:13,914 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:13,918 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@36d6f48fidentifier=tcp://laptop-name:40743/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@340f8c0f{da=com.datatorrent.bufferserver.internal.DataList$Block@35f59995{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=813, starting_window=5b28089000000001, ending_window=5b2808900000000a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@735c99c3[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000119 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20817
Log Contents:
2018-06-18 19:49:14,918 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:49:16,309 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000119/tmp as the basepath for spooling. 
2018-06-18 19:49:16,314 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45669 2018-06-18 19:49:17,387 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:49:17,498 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45669/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:17,600 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:49:17,738 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000119/tmp/chkp7938207934503707627 as the basepath for checkpointing. 2018-06-18 19:49:17,759 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:49:17,880 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@71d02b4f for node 2 2018-06-18 19:49:17,929 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.addBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:418) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.addBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1455) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1251) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:19,634 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:49:19,635 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:49:19,638 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@12eb85e0identifier=tcp://laptop-name:45669/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4d701d40{da=com.datatorrent.bufferserver.internal.DataList$Block@857d17d{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198, starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@420d0ac8[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000086 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23947 Log Contents: 2018-06-18 19:44:11,141 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:44:12,323 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000086/tmp as the basepath for spooling. 
2018-06-18 19:44:12,327 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37751 2018-06-18 19:44:13,415 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:44:13,515 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:44:13,534 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37751/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:13,593 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000086/tmp/chkp7055143576694248842 as the basepath for checkpointing. 2018-06-18 19:44:13,606 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:44:13,721 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@6c28d48f for node 2 2018-06-18 19:44:13,722 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:44:13,723 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745477_4653 2018-06-18 19:44:13,725 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:44:13,726 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:44:15,546 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:44:15,550 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:44:15,554 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@69bdc29fidentifier=tcp://laptop-name:37751/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3cd626f3{da=com.datatorrent.bufferserver.internal.DataList$Block@66bb5f4b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1943, starting_window=5b28089000000001, ending_window=5b28089000000011, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@16a58456[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000045 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20915 Log Contents: 2018-06-18 19:37:57,503 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:37:58,696 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000045/tmp as the basepath for spooling. 
2018-06-18 19:37:58,701 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39349 2018-06-18 19:37:59,777 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:37:59,866 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:37:59,946 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000045/tmp/chkp363512898461517777 as the basepath for checkpointing. 2018-06-18 19:37:59,960 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:38:00,073 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@54075859 for node 2 2018-06-18 19:38:00,174 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39349/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:38:00,195 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:38:00,195 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:38:01,902 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:01,905 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:38:01,910 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@447b91deidentifier=tcp://laptop-name:39349/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@23058507{da=com.datatorrent.bufferserver.internal.DataList$Block@6372a670{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2b33a76b[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000012 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:19006 Log Contents: 2018-06-18 19:32:56,334 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:32:57,515 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000012/tmp as the basepath for spooling. 
2018-06-18 19:32:57,519 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46031 2018-06-18 19:32:58,632 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:32:58,706 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46031/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:58,723 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:32:58,808 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:33:00,767 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:33:00,770 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:33:00,774 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4c802cd8identifier=tcp://laptop-name:46031/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3e4792f0{da=com.datatorrent.bufferserver.internal.DataList$Block@34590222{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@5c8b2382[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000144 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:53:02,771 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:53:04,090 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000144/tmp as the basepath for spooling. 
2018-06-18 19:53:04,094 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40071 2018-06-18 19:53:05,188 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:53:05,311 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:53:05,383 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000144/tmp/chkp8763071012821796633 as the basepath for checkpointing. 2018-06-18 19:53:05,387 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:53:05,508 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5b217f41 for node 2 2018-06-18 19:53:05,519 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:53:05,520 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746592_5768 2018-06-18 19:53:05,522 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:53:05,523 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:53:05,572 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40071/2.out.1, windowId=5b28089000000032, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:53:07,335 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:53:07,337 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:53:07,342 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@328d8d94identifier=tcp://laptop-name:40071/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5b0d42aa{da=com.datatorrent.bufferserver.internal.DataList$Block@6a63db83{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@76ccf27[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000111 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:48:00,919 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:48:02,146 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000111/tmp as the basepath for spooling. 
2018-06-18 19:48:02,150 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43497 2018-06-18 19:48:03,275 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:48:03,407 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:48:03,478 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000111/tmp/chkp7335646682470750172 as the basepath for checkpointing. 2018-06-18 19:48:03,483 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:48:03,602 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@57a41695 for node 2 2018-06-18 19:48:03,618 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43497/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:48:03,620 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:03,621 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745958_5134 2018-06-18 19:48:03,628 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:05,433 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:48:05,436 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:48:05,442 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4c1ea263identifier=tcp://laptop-name:43497/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3e415e7b{da=com.datatorrent.bufferserver.internal.DataList$Block@129775a8{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@431543c7[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000078 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:42:58,191 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:42:59,385 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000078/tmp as the basepath for spooling. 
2018-06-18 19:42:59,390 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38015
2018-06-18 19:43:00,485 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:43:00,587 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:43:00,660 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000078/tmp/chkp5756509108612802068 as the basepath for checkpointing.
2018-06-18 19:43:00,669 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:43:00,695 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38015/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:00,788 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1f491aeb for node 2
2018-06-18 19:43:00,805 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:43:00,805 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745324_4500
2018-06-18 19:43:00,807 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:43:00,808 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:43:02,617 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:43:02,620 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:43:02,624 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7b83f61e{identifier=tcp://laptop-name:38015/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@295101e9{da=com.datatorrent.bufferserver.internal.DataList$Block@396627b0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@70cc3577[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000037 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:36:44,497 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:36:45,670 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000037/tmp as the basepath for spooling. 
2018-06-18 19:36:45,673 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39841
2018-06-18 19:36:46,750 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39841/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:46,755 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:46,850 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:46,910 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000037/tmp/chkp7641466068493509516 as the basepath for checkpointing.
2018-06-18 19:36:46,916 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:47,034 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@dcc87df for node 2
2018-06-18 19:36:47,035 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:47,036 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744540_3716
2018-06-18 19:36:47,038 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:47,039 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:36:48,880 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:48,883 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:48,888 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@9d0bd3a{identifier=tcp://laptop-name:39841/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@71efebab{da=com.datatorrent.bufferserver.internal.DataList$Block@167c6e54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@64eb6f2f[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000004 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:218859
Log Contents:
2018-06-18 19:31:42,815 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:31:44,317 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000004/tmp as the basepath for spooling.
2018-06-18 19:31:44,321 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43545
2018-06-18 19:31:45,400 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:31:49,536 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:31:49,546 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:31:50,558 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:51,067 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:51,571 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:52,077 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:52,580 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:53,086 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:53,594 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:54,101 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:31:54,608 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b2808900000000c, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:31:58,653 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:31:58,662 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:31:59,671 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:00,180 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:00,683 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:01,188 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:01,693 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:02,196 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:02,698 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:03,205 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:03,710 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b2808900000000d, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:07,742 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:07,753 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:08,760 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:09,266 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:09,772 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:10,278 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:10,780 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:11,282 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:11,789 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:12,294 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:12,799 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:13,305 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b2808900000000d, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:17,336 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:17,346 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:18,355 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:18,861 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:19,363 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:19,866 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:20,368 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:20,870 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:21,375 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:21,880 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:22,384 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000011, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:26,411 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:26,414 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:27,422 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:27,928 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:28,430 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:28,935 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:29,437 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:29,940 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:30,444 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:30,949 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:31,453 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000011, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:35,495 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:35,497 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:36,506 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:37,011 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:37,513 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:38,015 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:38,517 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:39,023 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:39,528 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:40,032 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:40,537 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000011, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:44,577 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:44,583 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:45,594 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:46,100 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:46,602 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:47,106 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:47,609 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:48,113 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:48,615 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:49,119 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:49,624 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:50,126 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:32:54,150 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:32:54,154 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:32:55,161 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:55,664 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:56,169 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:56,671 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:57,174 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:57,678 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:58,182 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:32:58,687 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:02,724 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:02,726 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:03,732 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:04,237 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:04,740 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:05,250 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:05,758 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:06,261 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:06,762 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:07,266 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:07,770 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:08,274 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:12,302 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:12,309 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:13,315 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:13,819 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:14,323 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:14,827 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:15,329 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:15,831 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:16,335 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:16,839 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:17,344 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:21,370 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:21,377 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:22,383 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:22,888 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:23,390 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:23,892 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:24,394 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:24,896 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:25,399 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:25,903 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:26,407 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:30,429 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:30,439 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:31,445 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:31,949 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:32,450 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:32,952 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:33,455 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:33,957 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:34,461 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:34,966 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:35,479 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:39,506 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:39,511 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:40,517 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:41,022 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:41,524 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:42,026 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:42,528 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:43,030 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:43,533 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:44,037 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:44,540 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:48,567 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:48,576 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:49,582 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:50,088 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:50,589 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:51,106 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:51,609 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:52,110 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:52,613 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:53,117 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:53,619 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:33:57,645 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:33:57,649 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:58,656 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:59,161 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:33:59,665 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:00,167 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:00,668 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:01,170 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:01,674 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:02,178 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:02,682 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:06,718 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:06,727 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:07,733 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:08,237 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:08,739 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:09,243 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:09,745 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:10,246 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:10,748 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:11,252 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:11,760 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:15,799 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:15,805 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:16,812 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:17,314 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:17,816 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:18,318 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:18,820 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:19,322 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:19,824 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:20,327 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:20,832 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:24,852 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:24,861 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:25,867 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:26,371 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:26,873 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:27,379 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:27,881 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:28,383 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:28,886 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:29,389 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:29,893 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:30,395 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:34,413 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:34,423 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:35,428 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:35,930 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:36,432 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:36,933 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:37,435 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:37,937 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:38,440 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:38,944 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:39,448 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:43,473 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:43,482 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:44,489 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:44,993 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:45,495 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:45,999 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:46,501 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:47,002 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:47,505 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:48,009 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:48,511 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:34:52,529 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:34:52,536 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:53,543 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:54,047 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:54,549 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:55,051 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:55,552 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:56,054 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:56,557 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:57,058 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:34:57,562 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:01,585 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:01,588 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:02,592 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:03,096 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:03,598 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:04,099 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:04,600 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:05,102 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:05,605 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:06,108 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:06,612 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:10,633 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:10,643 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:11,648 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:12,151 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:12,652 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:13,153 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:13,655 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:14,156 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:14,659 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:15,162 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:15,664 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:19,679 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:19,688 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:20,693 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:21,197 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:21,698 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:22,199 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:22,701 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:23,202 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:23,704 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:24,207 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:24,711 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:28,735 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:28,736 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:29,740 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:30,243 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:30,744 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:31,248 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:31,749 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:32,251 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:32,752 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:33,255 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:33,762 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:37,785 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:37,796 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:38,801 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:39,304 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:39,805 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:40,307 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:40,808 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:41,310 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:41,812 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:42,315 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:42,818 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:43,322 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:47,355 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:47,363 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:48,369 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:48,872 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:49,374 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:49,876 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:50,378 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:50,879 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:51,382 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:51,885 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:52,387 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:35:56,409 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:35:56,415 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:35:57,420 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:57,924 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:58,425 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:58,927 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:59,428 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:35:59,929 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:00,432 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:00,934 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:01,436 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:05,454 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:05,465 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:06,470 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:06,974 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:07,475 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:07,977 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:08,479 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:08,980 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:09,483 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:09,986 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:10,489 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:14,517 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:14,524 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:15,529 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:16,032 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:16,534 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:17,036 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:17,537 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:18,039 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:18,541 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:19,044 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:19,546 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:23,564 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:23,568 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:24,573 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:25,076 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:25,578 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:26,079 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:26,581 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:27,082 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:27,584 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:28,087 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:28,595 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:32,615 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:32,623 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:33,628 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:34,131 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:34,633 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:35,134 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:35,636 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:36,137 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:36,638 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:37,141 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:37,652 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:41,687 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:41,697 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:42,702 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:43,205 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:43,706 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:44,208 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:44,715 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:45,217 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:45,718 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:46,221 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:46,729 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:36:50,753 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:36:50,755 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:51,760 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:52,263 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:52,768 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:53,269 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:53,771 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:54,272 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:54,774 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:55,276 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:55,779 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:36:56,282 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:00,302 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:00,309 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:01,313 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:01,816 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:02,318 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:02,819 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:03,320 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:03,822 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:04,325 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:04,827 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:05,330 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:09,347 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:09,355 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:10,360 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:10,864 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:11,365 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:11,867 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:12,368 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:12,870 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:13,372 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:13,875 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:14,377 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:18,400 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:18,404 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:19,409 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:19,912 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:20,413 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:20,914 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:21,416 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:21,917 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:22,419 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:22,922 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:23,424 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000013, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:27,451 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:27,454 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:28,459 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:28,962 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:29,463 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:29,965 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:30,466 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:30,967 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:31,469 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:31,972 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:32,474 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000016, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:36,490 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:36,495 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:37,499 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:38,002 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:38,503 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:39,005 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:39,506 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:40,007 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:40,510 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:41,013 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:41,515 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000016, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:45,541 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:45,544 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:46,548 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:47,052 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:47,553 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:48,055 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:48,556 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:49,057 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:49,559 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:50,061 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:50,565 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:37:54,592 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:37:54,597 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:55,601 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:56,104 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:56,606 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:57,107 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:57,609 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:58,110 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:58,611 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:59,113 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:37:59,616 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:00,120 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:04,151 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:04,153 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:05,158 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:05,659 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:06,160 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:06,662 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:07,163 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:07,664 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:08,167 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:08,669 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:09,171 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:13,185 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:13,189 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:14,193 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:14,696 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:15,198 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:15,699 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:16,200 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:16,701 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:17,204 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:17,706 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:18,208 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:22,224 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:22,227 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:23,231 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:23,734 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:24,235 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:24,736 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:25,238 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:25,739 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:26,241 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:26,743 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:27,245 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:31,263 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:31,272 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:32,276 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:32,778 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:33,279 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:33,782 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:34,283 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:34,784 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:35,286 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:35,789 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:36,291 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:40,305 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:40,306 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:41,310 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:41,813 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:42,315 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:42,816 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:43,317 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:43,818 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:44,321 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:44,823 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:45,325 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:49,347 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:49,354 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:50,358 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:50,861 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:51,362 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:51,863 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:52,365 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:52,866 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:53,369 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:53,872 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:54,374 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:38:58,408 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:38:58,412 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:38:59,416 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:38:59,919 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:00,420 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:00,921 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:01,423 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:01,924 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:02,425 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:02,928 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:03,430 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:03,933 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:07,952 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:07,960 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:08,964 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:09,465 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:09,967 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:10,468 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:10,969 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:11,470 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:11,973 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:12,475 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:12,977 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:16,998 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:17,007 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:18,011 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:18,514 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:19,015 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:19,517 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:20,018 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:20,519 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:21,022 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:21,524 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:22,026 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:26,051 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:26,059 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:27,064 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:27,566 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:28,067 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:28,569 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:29,070 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:29,571 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:30,073 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:30,576 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:31,078 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:35,095 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:35,095 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:36,100 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:36,603 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:37,105 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:37,606 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:38,107 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:38,608 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:39,111 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:39,613 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:40,116 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:44,144 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:44,151 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:45,155 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:45,658 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:46,160 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:46,661 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:47,162 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:47,663 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:48,166 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:48,668 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:49,170 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:39:53,188 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:39:53,190 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:54,194 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:54,697 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:55,199 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:55,700 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:56,202 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:56,704 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:57,206 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:57,708 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:39:58,210 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:40:02,253 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:40:02,257 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:03,271 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:03,774 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:04,275 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:04,776 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:05,278 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:05,780 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:06,281 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:06,784 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:07,287 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:07,790 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:40:11,810 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:40:11,817 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:12,822 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:13,324 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:13,825 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:14,326 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:14,827 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:15,329 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:15,831 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:16,334 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:16,838 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:40:20,875 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:40:20,880 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:21,883 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:22,385 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:22,886 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:23,387 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:23,888 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:24,389 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:24,892 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:25,393 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:25,895 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:40:29,910 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:40:29,916 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:30,920 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:31,422 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:31,923 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:32,424 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:32,925 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:33,426 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:33,929 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:34,432 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:34,934 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:40:38,952 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:40:38,961 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:39,965 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:40,469 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:40,970 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:41,471 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:41,972 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:42,473 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:42,976 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:43,478 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:40:43,980 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:40:48,008 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:40:48,011 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:40:49,015 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:49,518 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:50,020 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:50,521 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:51,022 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:51,525 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:52,026 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:52,528 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:53,033 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:40:57,061 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:40:57,071 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:40:58,075 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:58,578 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:59,079 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:40:59,580 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:00,081 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:00,583 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:01,085 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:01,587 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:02,089 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:06,107 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:06,112 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:07,116 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:07,619 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:08,120 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:08,622 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:09,125 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:41:09,626 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:10,127 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:10,630 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:11,132 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:11,634 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:15,653 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:15,662 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:16,666 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:17,167 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:17,668 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:18,170 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:18,671 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:19,172 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:19,673 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:20,176 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:41:20,679 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:24,700 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:24,702 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:25,706 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:26,208 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:26,710 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:27,211 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:27,712 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:28,213 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:28,715 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:29,218 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:29,721 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:33,743 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:33,750 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:41:34,754 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:35,257 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:35,759 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:36,260 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:36,761 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:37,262 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:37,765 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:38,267 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:38,769 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:42,801 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:42,812 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:43,817 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:44,320 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:44,821 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:45,322 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:45,823 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:41:46,324 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:46,827 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:47,329 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:47,831 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:41:51,857 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:41:51,861 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:41:52,865 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:53,368 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:53,870 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:54,371 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:54,872 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:55,373 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:55,875 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:41:56,377 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:41:56,879 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:00,905 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:00,912 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:42:01,917 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:02,419 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:02,921 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:03,422 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:03,923 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:04,427 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:04,929 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:05,431 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:05,933 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:09,952 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:09,961 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:42:10,965 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:11,468 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:11,969 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:12,472 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:12,974 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:13,475 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:13,977 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:14,480 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:14,981 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:18,994 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:19,004 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:42:20,008 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:20,510 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:21,012 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:21,513 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:22,014 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:42:22,515 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:23,017 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:23,519 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:24,023 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:28,043 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:28,048 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:42:29,052 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:29,555 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:30,057 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:30,560 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:31,061 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:31,562 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:32,064 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:32,566 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:33,069 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:42:33,571 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:37,590 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:37,592 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:42:38,596 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:39,098 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:39,600 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:40,101 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:40,602 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:41,103 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:41,606 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:42,108 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:42,611 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:46,629 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:46,632 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:42:47,636 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:48,138 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:48,640 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:49,141 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:49,642 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:50,143 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:50,645 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:51,147 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:51,649 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:42:55,667 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:42:55,672 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:42:56,676 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:57,179 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:57,680 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:58,181 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:58,682 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:42:59,183 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:42:59,686 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:00,188 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:00,690 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:04,706 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:04,710 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:05,714 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:06,217 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:06,718 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:07,219 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:07,720 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:08,221 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:08,723 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:09,226 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:43:09,727 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:13,742 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:13,752 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:14,756 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:15,259 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:15,761 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:16,264 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:16,766 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:17,267 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:17,768 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:18,270 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:18,772 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:19,274 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:43:19,776 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:22,799 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:22,800 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:23,803 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:24,306 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:24,807 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:25,308 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:25,809 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:26,310 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:26,811 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:27,313 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:27,815 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:43:28,318 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:32,334 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:32,340 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:33,343 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:33,844 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:34,345 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:34,846 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:35,347 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:35,848 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:36,350 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:36,852 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:37,355 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:41,374 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:41,383 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:43:42,387 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:42,890 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:43,392 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:43,893 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:44,394 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:44,895 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:45,397 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:45,899 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:46,401 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:50,421 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:50,431 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:51,435 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:51,937 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:52,438 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:52,940 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:53,441 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:43:53,943 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:54,445 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:54,946 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:43:55,448 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:43:59,466 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:43:59,468 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:44:00,472 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:00,975 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:01,476 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:01,977 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:02,478 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:02,979 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:03,482 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:03,984 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:44:04,486 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:08,503 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:08,506 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:44:09,507 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:10,010 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:10,511 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:11,012 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:11,513 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:12,014 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:12,516 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:13,018 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:13,520 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:17,544 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:17,545 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:44:18,549 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:19,051 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:19,553 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:20,054 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:20,555 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:21,056 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:21,557 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:22,059 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:22,563 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:26,596 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:26,599 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:44:27,603 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:28,105 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:28,606 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:29,107 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:29,610 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:44:30,111 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:30,612 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:31,114 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:31,616 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:32,118 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:36,130 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:36,131 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:44:37,134 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:37,636 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:38,137 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:38,638 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:39,139 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:39,640 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:40,142 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:40,644 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:44:41,146 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:45,162 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:45,167 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:44:46,171 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:46,674 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:47,175 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:47,677 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:48,178 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:48,679 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:49,181 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:49,683 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:50,184 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:44:54,197 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:44:54,207 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:44:55,211 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:55,714 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:56,215 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:56,716 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:57,217 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:57,719 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:58,221 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:58,723 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:44:59,225 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:03,241 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:03,245 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:04,249 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:04,752 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:05,253 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:05,755 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:06,256 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:45:06,757 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:07,259 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:07,761 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:08,263 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:12,278 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:12,280 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:13,284 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:13,786 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:14,287 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:14,790 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:15,292 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:15,793 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:16,294 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:16,796 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:45:17,300 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:21,325 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:21,330 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:22,336 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:22,838 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:23,339 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:23,841 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:24,342 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:24,844 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:25,344 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:25,847 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:26,350 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:45:26,853 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:30,870 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:30,876 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:31,880 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:32,381 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:32,882 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:33,384 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:33,885 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:34,386 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:34,888 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:35,391 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:35,895 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:39,917 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:39,924 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:45:40,927 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:41,429 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:41,930 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:42,432 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:42,932 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:43,434 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:43,936 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:44,438 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:44,940 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:48,951 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:48,955 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:49,959 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:50,462 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:50,963 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:51,464 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:51,965 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:45:52,467 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:52,969 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:53,471 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:53,973 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:45:57,997 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:45:58,001 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:59,006 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:45:59,508 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:00,009 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:00,510 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:01,012 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:01,513 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:02,015 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:02,517 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:46:03,019 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:07,037 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:07,042 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:46:08,046 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:08,549 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:09,051 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:09,552 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:10,053 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:10,554 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:11,056 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:11,558 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:12,059 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:16,076 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:16,079 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:46:17,084 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:17,587 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:18,089 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:18,590 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:19,091 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:19,592 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:20,093 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:20,595 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:21,097 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:21,599 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:25,611 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:25,622 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:46:26,627 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:27,128 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:27,629 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:28,130 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:46:28,632 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:29,133 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:29,634 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:30,136 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:30,638 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:34,653 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:34,661 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:46:35,672 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:36,173 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:36,675 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:37,176 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:37,684 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:38,185 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:38,687 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:39,189 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:46:39,691 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:43,705 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:43,714 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:46:44,716 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:45,217 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:45,719 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:46,220 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:46,721 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:47,223 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:47,724 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:48,226 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:48,728 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:46:52,740 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:46:52,745 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:46:53,750 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:54,251 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:54,752 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:55,253 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:55,754 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:56,255 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:56,757 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:57,259 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:46:57,760 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:47:01,779 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:47:01,783 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:47:02,786 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:03,288 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:03,790 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:04,291 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:04,792 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:47:05,293 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:05,795 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:06,297 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:06,798 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:47:10,811 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:47:10,819 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:47:11,822 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:12,324 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:12,825 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:13,327 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:13,828 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:14,329 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:14,831 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:15,333 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:47:15,838 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:47:19,856 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:47:19,862 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:47:20,866 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:21,368 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:21,870 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:22,371 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:22,872 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:23,373 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:23,874 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:24,377 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:24,879 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:47:25,381 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:47:29,419 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:47:29,428 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:47:30,431 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:30,933 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:31,434 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:31,935 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:32,436 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:32,938 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:33,439 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:33,941 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:34,442 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:34,944 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:47:35,446 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]] 2018-06-18 19:47:39,475 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3] 2018-06-18 19:47:39,476 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:47:40,480 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:40,981 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:41,483 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:41,985 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:42,486 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:42,987 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:43,489 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:43,990 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:44,492 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 2018-06-18 19:47:44,995 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request. 
2018-06-18 19:47:45,499 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:47:49,549 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:47:49,549 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:50,551 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:51,052 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:51,553 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:52,054 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:52,556 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:53,060 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:53,562 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:54,064 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:54,567 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:55,068 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:47:58,085 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:47:58,094 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:59,097 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:47:59,600 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:00,101 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:00,602 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:01,103 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:01,605 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:02,106 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:02,608 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:03,110 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:03,611 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:07,628 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:07,638 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:08,641 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:09,143 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:09,644 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:10,145 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:10,647 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:11,148 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:11,650 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:12,152 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:12,654 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:16,671 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:16,678 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:17,682 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:18,184 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:18,685 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:19,186 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:19,687 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:20,188 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:20,690 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:21,193 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:21,694 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:25,712 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:25,723 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:26,726 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:27,228 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:27,729 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:28,230 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:28,732 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:29,233 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:29,735 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:30,237 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:30,738 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:34,757 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:34,765 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:35,768 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:36,270 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:36,772 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:37,273 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:37,774 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:38,275 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:38,777 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:39,279 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:39,781 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:43,800 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:43,804 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:44,808 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:45,310 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:45,812 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:46,313 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:46,814 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:47,317 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:47,819 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:48,320 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:48,822 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:48:52,836 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:48:52,837 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:53,841 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:54,343 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:54,844 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:55,345 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:55,847 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:56,350 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:56,851 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:57,353 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:57,856 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:58,359 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:48:58,861 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:02,908 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:02,910 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:03,914 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:04,417 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:04,918 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:05,419 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:05,920 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:06,422 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:06,922 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:07,425 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:07,927 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:08,428 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:12,441 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:12,444 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:13,447 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:13,949 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:14,451 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:14,953 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:15,454 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:15,955 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:16,457 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:16,959 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:17,461 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:21,518 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:21,520 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:22,523 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:23,026 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:23,527 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:24,028 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:24,529 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:25,030 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:25,532 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:26,034 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:26,536 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:30,550 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:30,557 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:31,560 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:32,063 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:32,564 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:33,065 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:33,569 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:34,070 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:34,572 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:35,075 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:35,576 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:39,596 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:39,602 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:40,607 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:41,108 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:41,610 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:42,111 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:42,612 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:43,113 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:43,614 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:44,118 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:44,620 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:48,657 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:48,667 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:49,671 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:50,174 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:50,675 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:51,175 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:51,677 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:52,178 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:52,680 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:53,182 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:53,683 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:49:57,696 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:49:57,698 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:49:58,701 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:59,204 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:49:59,705 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:00,206 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:00,707 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:01,209 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:01,712 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:02,214 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:02,715 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:03,217 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:07,236 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:07,242 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:08,246 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:08,747 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:09,249 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:09,750 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:10,251 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:10,752 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:11,254 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:11,756 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:12,258 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:16,281 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:16,292 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:17,297 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:17,799 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:18,300 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:18,801 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:19,302 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:19,803 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:20,305 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:20,807 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:21,309 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:25,325 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:25,331 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:26,334 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:26,836 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:27,337 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:27,838 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:28,340 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:28,841 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:29,343 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:29,845 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:30,347 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:34,369 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:34,378 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:35,383 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:35,885 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:36,387 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:36,887 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:37,388 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:37,890 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:38,392 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:38,894 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:39,396 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:43,417 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:43,422 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:44,426 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:44,928 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:45,429 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:45,930 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:46,431 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:46,933 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:47,435 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:47,937 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:48,438 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:50:52,454 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:50:52,462 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:53,466 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:53,969 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:54,470 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:54,971 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:55,472 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:55,974 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:56,476 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:56,978 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:50:57,480 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:01,498 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:01,501 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:02,506 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:03,007 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:03,508 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:04,009 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:04,510 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:05,011 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:05,512 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:06,014 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:06,516 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:07,020 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:11,041 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:11,051 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:12,056 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:12,557 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:13,058 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:13,564 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:14,065 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:14,566 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:15,068 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:15,570 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:19,590 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:19,599 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:20,603 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:21,106 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:21,607 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:22,108 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:22,610 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:23,110 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:23,612 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:24,114 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:24,618 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:28,642 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:28,652 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:29,657 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:30,159 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:30,660 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:31,161 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:31,662 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:32,163 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:32,664 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:33,166 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:33,668 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:34,172 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:38,189 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:38,191 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:39,194 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:39,697 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:40,198 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:40,700 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:41,200 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:41,702 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:42,204 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:42,706 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:43,207 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:47,221 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:47,224 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:48,228 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:48,730 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:49,231 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:49,732 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:50,233 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:50,737 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:51,239 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:51,741 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:52,242 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:51:56,257 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:51:56,262 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:51:57,265 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:57,767 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:58,269 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:58,770 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:59,271 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:51:59,772 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:00,274 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:00,777 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:01,280 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:05,304 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:05,306 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:06,311 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:06,813 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:07,314 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:07,818 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:08,319 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:08,820 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:09,321 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:09,823 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:10,325 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:10,826 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:14,844 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:14,848 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:15,852 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:16,353 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:16,854 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:17,356 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:17,857 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:18,358 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:18,860 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:19,362 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:19,863 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:23,876 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:23,880 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:24,884 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:25,385 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:25,887 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:26,388 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:26,889 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:27,395 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:27,896 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:28,397 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:28,898 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:29,400 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:32,408 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:32,418 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:33,422 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:33,925 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:34,427 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:34,928 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:35,429 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:35,930 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:36,431 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:36,934 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:37,436 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:37,938 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000023, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:41,965 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:41,967 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:42,971 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:43,473 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:43,974 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:44,477 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:44,978 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:45,479 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:45,981 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:46,483 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:46,985 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:47,486 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000032, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:52:51,511 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:52:51,512 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:52,517 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:53,018 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:53,519 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:54,021 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:54,522 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:55,024 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:55,525 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:52:56,027 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000032, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:53:00,041 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:53:00,051 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:53:01,054 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:01,557 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:02,058 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:02,560 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:03,061 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:03,562 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:04,063 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:04,566 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:05,068 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:05,569 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=3,name=FileOutput,type=GENERIC,checkpoint={5b28089000000032, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=writer,sourceNodeId=2,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[]]]
2018-06-18 19:53:09,582 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [3]
2018-06-18 19:53:09,589 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:53:10,592 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:11,094 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:11,595 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:12,096 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:12,597 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:13,098 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:13,601 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
2018-06-18 19:53:14,103 INFO com.datatorrent.stram.engine.StreamingContainer: Waiting for pending request.
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000136 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:51:49,847 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.ja
r:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/had
oop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.
jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/
lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop
-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jer
sey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-pr
oxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commo
ns-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:51:51,021 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000136/tmp as the basepath for spooling. 2018-06-18 19:51:51,026 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45059 2018-06-18 19:51:52,117 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:51:52,222 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:51:52,246 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45059/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:51:52,303 INFO com.datatorrent.common.util.AsyncFSStorageAgent: 
using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000136/tmp/chkp1248757487826573539 as the basepath for checkpointing. 2018-06-18 19:51:52,310 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:51:52,426 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2b470692 for node 2 2018-06-18 19:51:52,538 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at 
org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:51:52,538 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:51:54,253 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:51:54,257 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:51:54,262 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@128c160fidentifier=tcp://laptop-name:45059/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3bf16e8a{da=com.datatorrent.bufferserver.internal.DataList$Block@1ea230d6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@533f75a8[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000103 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:46:46,088 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:46:47,301 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000103/tmp as the basepath for spooling. 
2018-06-18 19:46:47,306 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36409 2018-06-18 19:46:48,400 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:46:48,498 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:46:48,572 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000103/tmp/chkp9159221223103219210 as the basepath for checkpointing. 2018-06-18 19:46:48,576 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:46:48,696 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1e3c9f96 for node 2 2018-06-18 19:46:48,702 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:46:48,703 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745801_4977 2018-06-18 19:46:48,705 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:46:48,706 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:46:48,731 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36409/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:50,523 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:46:50,526 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:46:50,530 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2422d06identifier=tcp://laptop-name:36409/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4d7d5f6{da=com.datatorrent.bufferserver.internal.DataList$Block@5daa90a9{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@73f5369c[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000070 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:41:45,399 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:41:46,652 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000070/tmp as the basepath for spooling.
2018-06-18 19:41:46,657 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45325
2018-06-18 19:41:47,773 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:41:47,852 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45325/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:47,882 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:41:47,935 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000070/tmp/chkp1164781853169176949 as the basepath for checkpointing.
2018-06-18 19:41:47,951 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:41:48,064 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@3553520a for node 2
2018-06-18 19:41:48,077 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:41:48,078 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745171_4347
2018-06-18 19:41:48,080 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:41:48,081 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:41:49,919 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:41:49,922 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:41:49,926 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3e56824c{identifier=tcp://laptop-name:45325/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@c1aadae{da=com.datatorrent.bufferserver.internal.DataList$Block@7e9c81e5{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1943, starting_window=5b28089000000001, ending_window=5b28089000000011, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@60398460[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000029 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22167
Log Contents:
2018-06-18 19:35:31,462 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:35:32,614 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000029/tmp as the basepath for spooling. 
2018-06-18 19:35:32,618 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45093 2018-06-18 19:35:33,694 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:35:33,776 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45093/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:33,800 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:35:33,885 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000029/tmp/chkp1605184838716694061 as the basepath for checkpointing. 2018-06-18 19:35:33,899 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:35:34,012 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4e6b7cea for node 2 2018-06-18 19:35:34,017 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:34,017 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744387_3563 2018-06-18 19:35:34,020 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:35,849 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:35:35,851 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:35:35,855 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@75c4a6f5identifier=tcp://laptop-name:45093/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3876e333{da=com.datatorrent.bufferserver.internal.DataList$Block@2b0e11ac{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@76dc68ef[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000128 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20916 Log Contents: 2018-06-18 19:50:36,792 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:50:38,006 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000128/tmp as the basepath for spooling. 
2018-06-18 19:50:38,010 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40053 2018-06-18 19:50:39,102 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:50:39,203 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:50:39,281 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000128/tmp/chkp7642333660168185998 as the basepath for checkpointing. 2018-06-18 19:50:39,298 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:50:39,402 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40053/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:39,408 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@7a244652 for node 2 2018-06-18 19:50:39,519 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:50:39,520 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:50:41,223 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:50:41,226 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:50:41,229 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@74dd236bidentifier=tcp://laptop-name:40053/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@28981815{da=com.datatorrent.bufferserver.internal.DataList$Block@1fb29b3c{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2789, starting_window=5b28089000000001, ending_window=5b28089000000015, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@25c34a57[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000095 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20914 Log Contents: 2018-06-18 19:45:33,270 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:34,521 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000095/tmp as the basepath for spooling. 
2018-06-18 19:45:34,526 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35649 2018-06-18 19:45:35,618 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:45:35,695 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:45:35,784 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000095/tmp/chkp2840438014793404635 as the basepath for checkpointing. 2018-06-18 19:45:35,793 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:45:35,898 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35649/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:35,911 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@9ad85c2 for node 2 2018-06-18 19:45:36,025 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:45:36,026 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:45:37,719 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:45:37,720 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:45:37,721 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@4b5cedcaidentifier=tcp://laptop-name:35649/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@38ce5688{da=com.datatorrent.bufferserver.internal.DataList$Block@21a1abc0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4ea15743[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000062 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:40:32,356 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:40:33,558 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000062/tmp as the basepath for spooling.
2018-06-18 19:40:33,562 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39207
2018-06-18 19:40:34,642 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:34,766 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:34,847 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000062/tmp/chkp3294809690533131224 as the basepath for checkpointing.
2018-06-18 19:40:34,869 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:34,939 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39207/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:40:34,988 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@77f4b26c for node 2
2018-06-18 19:40:35,006 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:40:35,007 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745018_4194
2018-06-18 19:40:35,008 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:40:35,010 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:40:36,795 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:36,798 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:36,803 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@c2e3711identifier=tcp://laptop-name:39207/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7c44070f{da=com.datatorrent.bufferserver.internal.DataList$Block@5ba3f616{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1574, starting_window=5b28089000000001, ending_window=5b2808900000000f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@29267927[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000021 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:34:18,557 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:34:19,726 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000021/tmp as the basepath for spooling.
2018-06-18 19:34:19,730 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39031 2018-06-18 19:34:20,816 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:34:20,847 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39031/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:34:20,920 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:34:21,005 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000021/tmp/chkp1454592531345705187 as the basepath for checkpointing. 2018-06-18 19:34:21,015 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:34:21,154 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@62c2bd36 for node 2 2018-06-18 19:34:21,199 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:21,200 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744236_3412 2018-06-18 19:34:21,202 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:34:21,203 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:34:22,949 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:34:22,951 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:34:22,956 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@30a7e3a3identifier=tcp://laptop-name:39031/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1b53ddf0{da=com.datatorrent.bufferserver.internal.DataList$Block@3489dce8{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@146f8a8f[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000120 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23945 Log Contents: 2018-06-18 19:49:23,938 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:49:25,274 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000120/tmp as the basepath for spooling. 
2018-06-18 19:49:25,278 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37023 2018-06-18 19:49:26,359 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:49:26,448 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:49:26,518 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000120/tmp/chkp8872913762339120484 as the basepath for checkpointing. 2018-06-18 19:49:26,522 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:49:26,541 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37023/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:26,643 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@79ef330d for node 2 2018-06-18 19:49:26,658 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:26,659 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746133_5309 2018-06-18 19:49:26,661 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:26,662 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:49:28,472 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:49:28,475 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:49:28,484 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@60a23b48identifier=tcp://laptop-name:37023/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@59336757{da=com.datatorrent.bufferserver.internal.DataList$Block@7b4e9952{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@f163f31[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000087 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20915 Log Contents: 2018-06-18 19:44:20,363 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:44:21,509 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000087/tmp as the basepath for spooling.
2018-06-18 19:44:21,512 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36277
2018-06-18 19:44:22,582 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:44:22,585 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36277/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:44:22,681 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:44:22,738 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000087/tmp/chkp3966397242338848900 as the basepath for checkpointing.
2018-06-18 19:44:22,765 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:44:22,959 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@3b79f9c7 for node 2
2018-06-18 19:44:22,981 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:44:22,981 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:44:24,701 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:44:24,704 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:44:24,708 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@31936000identifier=tcp://laptop-name:36277/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5e5a4838{da=com.datatorrent.bufferserver.internal.DataList$Block@181ca1c7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7cdf5fef[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000054 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:39:19,400 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:39:20,586 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000054/tmp as the basepath for spooling.
2018-06-18 19:39:20,590 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36435
2018-06-18 19:39:21,676 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:21,797 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:21,873 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000054/tmp/chkp1065629218119026026 as the basepath for checkpointing.
2018-06-18 19:39:21,887 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:39:22,001 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@453af780 for node 2
2018-06-18 19:39:22,094 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36435/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:22,129 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:39:22,129 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:39:23,831 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:39:23,834 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:23,838 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e0ab770identifier=tcp://laptop-name:36435/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@748daca3{da=com.datatorrent.bufferserver.internal.DataList$Block@734cd0b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2141, starting_window=5b28089000000001, ending_window=5b28089000000012, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@49919f15[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000013 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:33:05,511 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:33:06,799 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000013/tmp as the basepath for spooling.
2018-06-18 19:33:06,804 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36913
2018-06-18 19:33:07,908 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:33:08,013 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:33:08,085 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000013/tmp/chkp8570387428943163900 as the basepath for checkpointing.
2018-06-18 19:33:08,089 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:33:08,209 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@672a71a6 for node 2
2018-06-18 19:33:08,217 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:08,217 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744082_3258
2018-06-18 19:33:08,219 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:08,220 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:33:08,284 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36913/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:10,034 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:33:10,044 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@21eba743identifier=tcp://laptop-name:36913/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2c2809f7{da=com.datatorrent.bufferserver.internal.DataList$Block@4a479813{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198, starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2e773114[identifier=2.out.1]
2018-06-18 19:33:10,044 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000145 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19707
Log Contents:
2018-06-18 19:53:11,794 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: ./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4
j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2
.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/
lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/common
s-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/u
sr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hado
op-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar
:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activa
tion-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:53:13,144 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000145/tmp as the basepath for spooling.
2018-06-18 19:53:13,148 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34857
2018-06-18 19:53:14,235 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:53:14,336 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:53:14,402 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000145/tmp/chkp911871434333604293 as the basepath for checkpointing.
2018-06-18 19:53:14,412 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:53:14,532 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@68594e71 for node 2
2018-06-18 19:53:14,539 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:418)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1455)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1251)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000112 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:48:10,101 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:48:11,413 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000112/tmp as the basepath for spooling. 
2018-06-18 19:48:11,416 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35453
2018-06-18 19:48:12,507 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:48:12,639 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:48:12,657 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35453/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:48:12,718 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000112/tmp/chkp1473473480174386193 as the basepath for checkpointing.
2018-06-18 19:48:12,724 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:48:12,850 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@967b1df for node 2
2018-06-18 19:48:12,868 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:12,869 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745978_5154
2018-06-18 19:48:12,870 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:48:12,871 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:48:14,672 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:48:14,675 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:48:14,679 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6647cb95identifier=tcp://laptop-name:35453/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@edfdbee{da=com.datatorrent.bufferserver.internal.DataList$Block@56232b34{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@40b6256c[identifier=2.out.1]

End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000079 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:43:07,365 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:43:08,513 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000079/tmp as the basepath for spooling. 
2018-06-18 19:43:08,516 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33615
2018-06-18 19:43:09,585 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:43:09,656 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:43:09,729 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000079/tmp/chkp9008885823515272828 as the basepath for checkpointing.
2018-06-18 19:43:09,750 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:43:09,757 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33615/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:43:09,873 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@59eef14e for node 2
2018-06-18 19:43:09,996 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:43:09,996 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:43:11,697 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:43:11,700 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:43:11,704 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e37fff6identifier=tcp://laptop-name:33615/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@342ac17e{da=com.datatorrent.bufferserver.internal.DataList$Block@58a8ddd7{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3023, starting_window=5b28089000000001, ending_window=5b28089000000016, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4fdcb5f8[identifier=2.out.1]

End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000046 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:38:06,487 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:38:07,672 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000046/tmp as the basepath for spooling. 
2018-06-18 19:38:07,675 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37299 2018-06-18 19:38:08,749 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:38:08,820 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:38:08,897 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000046/tmp/chkp5916531878652087716 as the basepath for checkpointing. 2018-06-18 19:38:08,920 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:38:09,028 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@ad80302 for node 2 2018-06-18 19:38:09,035 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:09,036 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744713_3889 2018-06-18 19:38:09,038 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:09,039 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:38:09,175 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37299/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:38:10,843 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:10,846 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:38:10,850 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2d2a82e6identifier=tcp://laptop-name:37299/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21fa88cb{da=com.datatorrent.bufferserver.internal.DataList$Block@6db45ff2{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3518, starting_window=5b28089000000001, ending_window=5b28089000000018, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@41db114c[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000005 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:31:52,260 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:31:53,439 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000005/tmp as the basepath for spooling. 
2018-06-18 19:31:53,443 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45867
2018-06-18 19:31:54,536 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:31:54,629 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45867/2.out.1, windowId=5b2808900000000c, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:31:54,664 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:31:54,725 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000005/tmp/chkp5885207272480804325 as the basepath for checkpointing.
2018-06-18 19:31:54,738 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:31:54,862 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@246d2949 for node 2
2018-06-18 19:31:54,863 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:31:54,863 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073743929_3105
2018-06-18 19:31:54,867 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:31:56,709 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:31:56,816 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:31:56,817 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@38594166identifier=tcp://laptop-name:45867/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@55c9409{da=com.datatorrent.bufferserver.internal.DataList$Block@793e6e62{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1574, starting_window=5b28089000000001, ending_window=5b2808900000000f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6207bea6[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000137 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:21709
Log Contents:
2018-06-18 19:51:58,823 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:51:59,987 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000137/tmp as the basepath for spooling. 
2018-06-18 19:51:59,991 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42923 2018-06-18 19:52:01,102 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:52:01,202 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:52:01,286 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42923/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:52:01,317 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000137/tmp/chkp4449498538856649871 as the basepath for checkpointing. 2018-06-18 19:52:01,371 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:52:01,488 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2ff43f05 for node 2
2018-06-18 19:52:01,491 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:659)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1533)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1309)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:52:01,491 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746458_5634
2018-06-18 19:52:01,501 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:52:03,243 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:52:03,248 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:03,256 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e65807f{identifier=tcp://laptop-name:42923/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@599199fe{da=com.datatorrent.bufferserver.internal.DataList$Block@61634cec{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@360fadf8[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000104 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:46:55,101 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:46:56,310 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000104/tmp as the basepath for spooling.
2018-06-18 19:46:56,314 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35537
2018-06-18 19:46:57,403 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:46:57,513 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:46:57,586 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000104/tmp/chkp2477538151289427280 as the basepath for checkpointing.
2018-06-18 19:46:57,604 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:46:57,715 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5ca9a79d for node 2
2018-06-18 19:46:57,803 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35537/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:46:57,833 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:46:57,834 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:46:59,547 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:46:59,549 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:46:59,556 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@439325c0{identifier=tcp://laptop-name:35537/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6b1e5861{da=com.datatorrent.bufferserver.internal.DataList$Block@5f00f77a{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=4328, starting_window=5b28089000000001, ending_window=5b2808900000001b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@6a5fcb37[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000071 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:41:54,451 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:55,656 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000071/tmp as the basepath for spooling. 
2018-06-18 19:41:55,661 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46773
2018-06-18 19:41:56,779 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:41:56,859 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:41:56,905 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46773/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:41:56,973 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000071/tmp/chkp5674426144103178754 as the basepath for checkpointing.
2018-06-18 19:41:56,986 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:41:57,102 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@496f58ff for node 2
2018-06-18 19:41:57,228 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:41:57,228 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:41:58,889 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:41:58,892 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:41:58,897 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6647cb95identifier=tcp://laptop-name:46773/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@edfdbee{da=com.datatorrent.bufferserver.internal.DataList$Block@56232b34{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1403, starting_window=5b28089000000001, ending_window=5b2808900000000e, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@40b6256c[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000038 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:36:53,621 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:36:54,800 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000038/tmp as the basepath for spooling. 
2018-06-18 19:36:54,803 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35145
2018-06-18 19:36:55,897 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:55,974 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:56,047 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000038/tmp/chkp2463174396074597185 as the basepath for checkpointing.
2018-06-18 19:36:56,053 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:56,172 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@10eade5d for node 2
2018-06-18 19:36:56,173 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:56,174 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744559_3735
2018-06-18 19:36:56,176 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:36:56,177 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:36:56,292 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35145/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:57,991 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:57,994 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:36:57,998 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@c311cb8identifier=tcp://laptop-name:35145/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ebdb22{da=com.datatorrent.bufferserver.internal.DataList$Block@1a5597d4{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@54a01df1[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000129 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:50:45,884 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:50:47,052 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000129/tmp as the basepath for spooling.
2018-06-18 19:50:47,055 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35615
2018-06-18 19:50:48,136 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:50:48,243 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:50:48,314 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000129/tmp/chkp6921052494462167715 as the basepath for checkpointing.
2018-06-18 19:50:48,318 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:50:48,439 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@68910a9f for node 2
2018-06-18 19:50:48,443 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35615/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:50:48,444 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:50:48,454 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746306_5482
2018-06-18 19:50:48,461 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:50:50,271 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:50:50,273 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:50:50,279 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@191e1b11identifier=tcp://laptop-name:35615/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d16c736{da=com.datatorrent.bufferserver.internal.DataList$Block@108a4b31{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@318909e5[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000096 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20916
Log Contents:
2018-06-18 19:45:42,381 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:43,709 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000096/tmp as the basepath for spooling. 
2018-06-18 19:45:43,713 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44345
2018-06-18 19:45:44,796 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:45:44,893 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:45:44,943 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44345/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:45:44,973 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000096/tmp/chkp4582775917150669854 as the basepath for checkpointing.
2018-06-18 19:45:44,986 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:45:45,179 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@60ebe88f for node 2
2018-06-18 19:45:45,203 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:45:45,203 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:45:46,928 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:45:46,931 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:45:46,936 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@18e325e7identifier=tcp://laptop-name:44345/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6046d641{da=com.datatorrent.bufferserver.internal.DataList$Block@25d6bd28{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1754, starting_window=5b28089000000001, ending_window=5b28089000000010, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@192c3a25[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:

End of LogType:stdout

Container: container_1529349239295_0005_01_000063 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22165
Log Contents:
2018-06-18 19:40:41,455 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:40:42,643 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000063/tmp as the basepath for spooling. 
2018-06-18 19:40:42,647 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42113
2018-06-18 19:40:43,741 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:43,833 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:43,902 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000063/tmp/chkp4075224755798798220 as the basepath for checkpointing.
2018-06-18 19:40:43,906 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:40:43,989 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42113/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:40:44,028 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5edf1694 for node 2 2018-06-18 19:40:44,057 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:40:44,058 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745037_4213 2018-06-18 19:40:44,061 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:40:45,858 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:40:45,860 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:40:45,865 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3fe979b3identifier=tcp://laptop-name:42113/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f6a428f{da=com.datatorrent.bufferserver.internal.DataList$Block@295033f8{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@d36eb9c[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000030 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23947 Log Contents: 2018-06-18 19:35:40,596 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:35:41,788 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000030/tmp as the basepath for spooling. 
2018-06-18 19:35:41,793 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:37837 2018-06-18 19:35:42,906 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:35:42,983 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:35:43,060 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000030/tmp/chkp1000613872895387744 as the basepath for checkpointing. 2018-06-18 19:35:43,070 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:35:43,186 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@68594e71 for node 2 2018-06-18 19:35:43,188 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:43,189 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744406_3582 2018-06-18 19:35:43,191 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:43,192 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:35:43,331 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:37837/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:45,004 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:35:45,007 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:35:45,012 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@789e365bidentifier=tcp://laptop-name:37837/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7456fb9a{da=com.datatorrent.bufferserver.internal.DataList$Block@4b2329ae{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1242, starting_window=5b28089000000001, ending_window=5b2808900000000d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@312325ad[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000121 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:49:33,076 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:49:34,274 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000121/tmp as the basepath for spooling. 
2018-06-18 19:49:34,277 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39989 2018-06-18 19:49:35,366 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:49:35,464 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:49:35,542 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000121/tmp/chkp2846041292138791853 as the basepath for checkpointing. 2018-06-18 19:49:35,548 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:49:35,584 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39989/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:35,668 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@57a41695 for node 2 2018-06-18 19:49:35,706 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:35,707 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746152_5328 2018-06-18 19:49:35,709 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:35,711 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:49:37,490 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:49:37,493 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:49:37,497 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@571ce084identifier=tcp://laptop-name:39989/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d37ccc0{da=com.datatorrent.bufferserver.internal.DataList$Block@2ee26b2f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@13abe35d[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000088 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:19005 Log Contents: 2018-06-18 19:44:29,456 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:44:30,750 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000088/tmp as the basepath for spooling.
2018-06-18 19:44:30,754 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45081
2018-06-18 19:44:31,842 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:44:31,988 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:44:32,069 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:44:32,121 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45081/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:44:34,017 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:44:34,019 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:44:34,025 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@218393f6identifier=tcp://laptop-name:45081/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@fe8249f{da=com.datatorrent.bufferserver.internal.DataList$Block@52f8d71c{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@359e13f3[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000055 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22167
Log Contents:
2018-06-18 19:39:28,532 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:39:29,750 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000055/tmp as the basepath for spooling.
2018-06-18 19:39:29,755 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34603
2018-06-18 19:39:30,853 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:30,966 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:31,040 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000055/tmp/chkp2512252908622003347 as the basepath for checkpointing.
2018-06-18 19:39:31,053 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:39:31,082 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34603/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:31,169 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@31928122 for node 2
2018-06-18 19:39:31,191 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:31,192 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744886_4062
2018-06-18 19:39:31,195 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:32,998 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:39:33,001 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:33,007 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@733701c4identifier=tcp://laptop-name:34603/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@36165dda{da=com.datatorrent.bufferserver.internal.DataList$Block@7454cafb{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1574, starting_window=5b28089000000001, ending_window=5b2808900000000f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@573cf2ee[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000022 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:34:27,698 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:34:28,997 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000022/tmp as the basepath for spooling.
2018-06-18 19:34:29,001 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41795
2018-06-18 19:34:30,098 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:34:30,218 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:34:30,291 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000022/tmp/chkp8582831936439772991 as the basepath for checkpointing.
2018-06-18 19:34:30,301 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:34:30,401 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41795/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:30,417 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@27126894 for node 2
2018-06-18 19:34:30,524 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:34:30,524 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:34:32,245 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:34:32,248 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:32,253 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@447b91deidentifier=tcp://laptop-name:41795/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@23058507{da=com.datatorrent.bufferserver.internal.DataList$Block@6372a670{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2b33a76b[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000138 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22166
Log Contents:
2018-06-18 19:52:08,348 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:52:09,587 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000138/tmp as the basepath for spooling.
2018-06-18 19:52:09,591 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40339 2018-06-18 19:52:10,692 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:52:10,824 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:52:10,848 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40339/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:52:10,932 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000138/tmp/chkp5514880276589015530 as the basepath for checkpointing. 2018-06-18 19:52:10,941 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:52:11,069 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4035a087 for node 2 2018-06-18 19:52:11,085 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:52:11,086 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746478_5654 2018-06-18 19:52:11,089 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:52:12,841 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:52:12,845 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:52:12,849 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@468cabafidentifier=tcp://laptop-name:40339/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@70de977c{da=com.datatorrent.bufferserver.internal.DataList$Block@47ebb1a2{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=367, starting_window=5b28089000000001, ending_window=5b28089000000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@425f0c68[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000113 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:48:19,152 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:48:20,338 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000113/tmp as the basepath for spooling. 
2018-06-18 19:48:20,342 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45449 2018-06-18 19:48:21,432 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:48:21,524 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:48:21,596 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000113/tmp/chkp6894911037651217828 as the basepath for checkpointing. 2018-06-18 19:48:21,607 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:48:21,699 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45449/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:48:21,724 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4fecfcd3 for node 2 2018-06-18 19:48:21,751 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:21,752 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745996_5172 2018-06-18 19:48:21,755 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:21,756 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:48:23,554 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:48:23,556 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:48:23,560 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@446242bfidentifier=tcp://laptop-name:45449/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1f4fdfef{da=com.datatorrent.bufferserver.internal.DataList$Block@7a2fdc54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=813, starting_window=5b28089000000001, ending_window=5b2808900000000a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2a136fdf[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000080 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20817 Log Contents: 2018-06-18 19:43:16,957 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:43:18,350 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000080/tmp as the basepath for spooling. 
2018-06-18 19:43:18,355 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38029 2018-06-18 19:43:19,478 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:43:19,622 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:43:19,711 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000080/tmp/chkp481603163293550443 as the basepath for checkpointing. 2018-06-18 19:43:19,719 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:43:19,790 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38029/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:19,840 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@39665c34 for node 2 2018-06-18 19:43:19,856 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.addBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:418) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.addBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1455) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1251) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:43:21,646 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:43:21,647 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:43:21,649 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1275ec47identifier=tcp://laptop-name:38029/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@257a74b9{da=com.datatorrent.bufferserver.internal.DataList$Block@43bf8978{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=198, starting_window=5b28089000000001, ending_window=5b28089000000004, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@298659dd[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000047 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22165 Log Contents: 2018-06-18 19:38:15,626 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
[classpath listing omitted — identical to the classpath logged by the previous container above]
2018-06-18 19:38:16,803 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000047/tmp as the basepath for spooling. 
2018-06-18 19:38:16,806 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38149 2018-06-18 19:38:17,879 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:38:17,966 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:38:18,038 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000047/tmp/chkp4680227352110889593 as the basepath for checkpointing. 2018-06-18 19:38:18,044 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:38:18,164 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@40733435 for node 2 2018-06-18 19:38:18,181 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:18,181 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744732_3908 2018-06-18 19:38:18,185 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:18,212 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38149/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:38:19,996 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:20,002 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:38:20,005 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@5dfe139aidentifier=tcp://laptop-name:38149/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@72d65ac{da=com.datatorrent.bufferserver.internal.DataList$Block@45a8e759{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@1738fe87[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000014 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20913 Log Contents: 2018-06-18 19:33:14,688 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:33:15,904 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000014/tmp as the basepath for spooling. 
2018-06-18 19:33:15,909 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36729 2018-06-18 19:33:17,016 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:33:17,162 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:33:17,236 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000014/tmp/chkp5217337668750303622 as the basepath for checkpointing. 2018-06-18 19:33:17,257 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:33:17,347 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36729/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:33:17,367 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4a10b7ab for node 2 2018-06-18 19:33:17,474 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:33:17,474 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:33:19,183 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:33:19,187 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:33:19,192 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@188f4edaidentifier=tcp://laptop-name:36729/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@e41bd41{da=com.datatorrent.bufferserver.internal.DataList$Block@3c81bb9{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1943, starting_window=5b28089000000001, ending_window=5b28089000000011, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@972d2b0[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000130 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:50:55,025 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:50:56,205 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000130/tmp as the basepath for spooling. 
2018-06-18 19:50:56,209 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43347 2018-06-18 19:50:57,331 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:50:57,419 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:50:57,487 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43347/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:50:57,499 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000130/tmp/chkp1273579019554544615 as the basepath for checkpointing. 2018-06-18 19:50:57,511 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:50:57,629 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2c5fa0da for node 2 2018-06-18 19:50:57,640 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:50:57,640 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746325_5501 2018-06-18 19:50:57,643 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:50:57,644 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:50:59,448 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:50:59,451 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:50:59,455 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1b0e26fd{identifier=tcp://laptop-name:43347/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@72a5ba46{da=com.datatorrent.bufferserver.internal.DataList$Block@64594dad{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@33bd8f19[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000105 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22164 Log Contents: 2018-06-18 19:47:04,352 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:47:05,515 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000105/tmp as the basepath for spooling.
2018-06-18 19:47:05,520 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44623
2018-06-18 19:47:06,615 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:47:06,740 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:47:06,802 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44623/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:06,828 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000105/tmp/chkp7623899298296728445 as the basepath for checkpointing.
2018-06-18 19:47:06,838 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:47:06,956 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@9391f1 for node 2
2018-06-18 19:47:06,978 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:06,978 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745840_5016
2018-06-18 19:47:06,982 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:08,770 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:47:08,773 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:08,777 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@177a3f99identifier=tcp://laptop-name:44623/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@57918e12{da=com.datatorrent.bufferserver.internal.DataList$Block@25caadaa{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@1ad8f344[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000072 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20914
Log Contents:
2018-06-18 19:42:03,559 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:42:04,771 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000072/tmp as the basepath for spooling.
2018-06-18 19:42:04,774 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38865
2018-06-18 19:42:05,848 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:42:05,945 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38865/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:05,948 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:42:06,038 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000072/tmp/chkp1008018455219428639 as the basepath for checkpointing.
2018-06-18 19:42:06,042 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:42:06,165 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@58e6066d for node 2
2018-06-18 19:42:06,281 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:42:06,281 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:42:07,979 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:42:07,982 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:42:07,986 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@3db0451identifier=tcp://laptop-name:38865/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@118bced0{da=com.datatorrent.bufferserver.internal.DataList$Block@68948ac1{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@58796c3e[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000039 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:37:02,697 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:37:03,862 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000039/tmp as the basepath for spooling.
2018-06-18 19:37:03,865 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38997
2018-06-18 19:37:04,932 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:37:05,023 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:37:05,098 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000039/tmp/chkp3804800498844736033 as the basepath for checkpointing.
2018-06-18 19:37:05,102 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:37:05,222 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1f495c22 for node 2
2018-06-18 19:37:05,226 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:05,227 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744578_3754
2018-06-18 19:37:05,229 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:37:05,230 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:37:05,334 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38997/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:37:07,038 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:37:07,041 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:37:07,045 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@57249db8identifier=tcp://laptop-name:38997/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3f362b34{da=com.datatorrent.bufferserver.internal.DataList$Block@2ab7794b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3a2f5eba[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000006 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:32:01,383 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:32:02,594 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000006/tmp as the basepath for spooling. 
2018-06-18 19:32:02,599 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40735 2018-06-18 19:32:03,701 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:32:03,732 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40735/2.out.1, windowId=5b2808900000000d, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:03,811 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:32:03,866 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000006/tmp/chkp2331703142762429214 as the basepath for checkpointing. 2018-06-18 19:32:03,879 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:32:03,995 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5887249b for node 2 2018-06-18 19:32:04,006 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:32:04,006 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073743948_3124 2018-06-18 19:32:04,008 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:32:04,009 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:32:05,850 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:32:05,853 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:32:05,857 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1b0a857cidentifier=tcp://laptop-name:40735/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6c84c20e{da=com.datatorrent.bufferserver.internal.DataList$Block@24c62903{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=572, starting_window=5b28089000000001, ending_window=5b28089000000008, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@40817fe7[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000122 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23945 Log Contents: 2018-06-18 19:49:42,224 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:49:43,482 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000122/tmp as the basepath for spooling. 
2018-06-18 19:49:43,487 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35815 2018-06-18 19:49:44,561 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:49:44,661 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35815/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:49:44,778 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:49:44,867 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000122/tmp/chkp1704060782428428062 as the basepath for checkpointing. 2018-06-18 19:49:44,896 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:49:45,006 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@77d5eb99 for node 2 2018-06-18 19:49:45,015 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:45,015 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746172_5348 2018-06-18 19:49:45,017 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:49:45,018 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:49:46,821 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:49:46,823 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:49:46,828 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@7334a8fbidentifier=tcp://laptop-name:35815/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@e60ef9b{da=com.datatorrent.bufferserver.internal.DataList$Block@ce3deff{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3779, starting_window=5b28089000000001, ending_window=5b28089000000019, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@5e6f7eab[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000097 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22165 Log Contents: 2018-06-18 19:45:51,499 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:45:52,677 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000097/tmp as the basepath for spooling. 
2018-06-18 19:45:52,682 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38663 2018-06-18 19:45:53,743 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:45:53,841 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:45:53,913 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000097/tmp/chkp1361922769023723391 as the basepath for checkpointing. 2018-06-18 19:45:53,917 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:45:53,982 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38663/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:45:54,037 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@65b3d1d4 for node 2 2018-06-18 19:45:54,042 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:45:54,043 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745686_4862 2018-06-18 19:45:54,046 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:45:55,864 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:45:55,865 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:45:55,868 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@191e1b11identifier=tcp://laptop-name:38663/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1d16c736{da=com.datatorrent.bufferserver.internal.DataList$Block@108a4b31{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@318909e5[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000064 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20913 Log Contents: 2018-06-18 19:40:50,616 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:40:51,905 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000064/tmp as the basepath for spooling.
2018-06-18 19:40:51,911 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46485
2018-06-18 19:40:53,042 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:40:53,070 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46485/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:40:53,162 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:40:53,222 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000064/tmp/chkp205447325782780676 as the basepath for checkpointing.
2018-06-18 19:40:53,230 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:40:53,428 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5b552e78 for node 2
2018-06-18 19:40:53,452 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:40:53,452 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:40:55,191 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:40:55,195 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:40:55,199 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2ab1d77bidentifier=tcp://laptop-name:46485/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@ad52184{da=com.datatorrent.bufferserver.internal.DataList$Block@1c1d1077{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=813, starting_window=5b28089000000001, ending_window=5b2808900000000a, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@61810ddb[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000031 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23944
Log Contents:
2018-06-18 19:35:49,787 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:35:50,939 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000031/tmp as the basepath for spooling.
2018-06-18 19:35:50,942 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36333 2018-06-18 19:35:51,985 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:35:52,094 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:35:52,171 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000031/tmp/chkp8250974497211449950 as the basepath for checkpointing. 2018-06-18 19:35:52,174 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:35:52,296 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2ccd6052 for node 2 2018-06-18 19:35:52,301 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:52,301 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744426_3602 2018-06-18 19:35:52,303 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:35:52,304 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:35:52,391 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36333/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:35:54,121 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:35:54,123 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:35:54,129 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@a9e869cidentifier=tcp://laptop-name:36333/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1451dfc6{da=com.datatorrent.bufferserver.internal.DataList$Block@2f6e02b3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@45fbf1ff[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000114 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:48:28,260 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:48:29,460 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000114/tmp as the basepath for spooling. 
2018-06-18 19:48:29,464 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35137 2018-06-18 19:48:30,555 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:48:30,649 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:48:30,723 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000114/tmp/chkp2140289355380842550 as the basepath for checkpointing. 2018-06-18 19:48:30,728 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:48:30,748 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35137/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:48:30,849 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@112dfc1b for node 2 2018-06-18 19:48:30,852 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:30,853 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746016_5192 2018-06-18 19:48:30,855 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:48:30,857 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:48:32,675 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:48:32,681 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:48:32,683 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@c311cb8identifier=tcp://laptop-name:35137/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5ebdb22{da=com.datatorrent.bufferserver.internal.DataList$Block@1a5597d4{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@54a01df1[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000089 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:44:38,453 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:44:39,669 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000089/tmp as the basepath for spooling.
2018-06-18 19:44:39,675 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40401
2018-06-18 19:44:40,757 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:44:40,846 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:44:40,918 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000089/tmp/chkp8616532558639936827 as the basepath for checkpointing.
2018-06-18 19:44:40,922 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:44:41,043 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@79ef330d for node 2
2018-06-18 19:44:41,054 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:44:41,060 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745534_4710
2018-06-18 19:44:41,061 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:44:41,063 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:44:41,151 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40401/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:44:42,862 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:44:42,876 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:44:42,877 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@571ce084identifier=tcp://laptop-name:40401/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d37ccc0{da=com.datatorrent.bufferserver.internal.DataList$Block@2ee26b2f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@13abe35d[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000056 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23946
Log Contents:
2018-06-18 19:39:37,631 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:39:38,787 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000056/tmp as the basepath for spooling.
2018-06-18 19:39:38,791 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42857
2018-06-18 19:39:39,846 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:39,934 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:40,006 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000056/tmp/chkp1259214003480690763 as the basepath for checkpointing.
2018-06-18 19:39:40,025 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:39:40,121 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42857/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:40,134 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@3f9376bd for node 2
2018-06-18 19:39:40,141 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:40,141 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744905_4081
2018-06-18 19:39:40,143 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:40,144 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
    at org.apache.hadoop.ipc.Client.call(Client.java:1460)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
    at java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
    at org.apache.hadoop.ipc.Client.call(Client.java:1454)
    ... 13 more
2018-06-18 19:39:41,972 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:39:41,976 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:41,981 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@d6ae79fidentifier=tcp://laptop-name:42857/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@72cae3f0{da=com.datatorrent.bufferserver.internal.DataList$Block@1f07d0f0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=3266, starting_window=5b28089000000001, ending_window=5b28089000000017, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@30513d0d[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000023 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19006
Log Contents:
2018-06-18 19:34:36,788 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:34:37,950 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000023/tmp as the basepath for spooling.
2018-06-18 19:34:37,954 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40679
2018-06-18 19:34:39,043 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:34:39,179 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:34:39,260 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:34:39,457 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40679/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:41,204 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:34:41,207 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:41,214 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1a908168identifier=tcp://laptop-name:40679/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@243e065a{da=com.datatorrent.bufferserver.internal.DataList$Block@6e8367ca{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2c36dbd2[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000139 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22165
Log Contents:
2018-06-18 19:52:17,223 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:52:18,550 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000139/tmp as the basepath for spooling. 
2018-06-18 19:52:18,554 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35401
2018-06-18 19:52:19,660 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:52:19,756 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:52:19,833 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000139/tmp/chkp2340065780835210268 as the basepath for checkpointing.
2018-06-18 19:52:19,836 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:52:19,867 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35401/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:19,957 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@79987a0d for node 2
2018-06-18 19:52:19,974 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:52:19,975 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746497_5673
2018-06-18 19:52:19,979 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:52:21,779 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:52:21,782 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:21,787 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2b3cfac9identifier=tcp://laptop-name:35401/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@63a42200{da=com.datatorrent.bufferserver.internal.DataList$Block@37d1f65e{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@23af7454[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000106 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22167
Log Contents:
2018-06-18 19:47:13,338 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:47:14,517 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000106/tmp as the basepath for spooling. 
2018-06-18 19:47:14,521 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:39641 2018-06-18 19:47:15,600 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:47:15,691 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:47:15,763 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000106/tmp/chkp8891064421523161596 as the basepath for checkpointing. 2018-06-18 19:47:15,772 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:47:15,846 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:39641/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:47:15,890 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1a859e25 for node 2 2018-06-18 19:47:15,900 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:47:15,901 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745859_5035 2018-06-18 19:47:15,904 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.InterruptedIOException: Call interrupted at org.apache.hadoop.ipc.Client.call(Client.java:1469) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:47:17,715 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:47:17,718 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:47:17,722 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6d6af294identifier=tcp://laptop-name:39641/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5a817f41{da=com.datatorrent.bufferserver.internal.DataList$Block@1e2473c2{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1090, starting_window=5b28089000000001, ending_window=5b2808900000000c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7b86945e[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000081 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:43:25,654 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:43:26,877 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000081/tmp as the basepath for spooling. 
2018-06-18 19:43:26,881 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40887 2018-06-18 19:43:27,973 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:43:28,079 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:43:28,158 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000081/tmp/chkp6193359908551493480 as the basepath for checkpointing. 2018-06-18 19:43:28,161 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:43:28,281 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@6997bac6 for node 2 2018-06-18 19:43:28,314 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:43:28,314 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745381_4557 2018-06-18 19:43:28,317 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:43:28,319 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:43:28,325 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40887/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:30,108 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:43:30,110 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:43:30,115 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@a9e869cidentifier=tcp://laptop-name:40887/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1451dfc6{da=com.datatorrent.bufferserver.internal.DataList$Block@2f6e02b3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@45fbf1ff[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000048 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23945 Log Contents: 2018-06-18 19:38:24,728 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:38:25,916 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000048/tmp as the basepath for spooling. 
2018-06-18 19:38:25,920 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36133 2018-06-18 19:38:27,014 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:38:27,164 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:38:27,236 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000048/tmp/chkp1754240326010441025 as the basepath for checkpointing. 2018-06-18 19:38:27,239 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:38:27,260 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36133/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:38:27,365 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@2f8535c3 for node 2 2018-06-18 19:38:27,372 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:27,372 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744752_3928 2018-06-18 19:38:27,374 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:27,375 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:38:29,186 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:29,189 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:38:29,193 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@17216e01identifier=tcp://laptop-name:36133/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3bb5eca9{da=com.datatorrent.bufferserver.internal.DataList$Block@173f664e{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=65, starting_window=5b28089000000001, ending_window=5b28089000000002, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2955713c[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000015 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:22166 Log Contents: 2018-06-18 19:33:23,862 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:33:25,050 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000015/tmp as the basepath for spooling.
2018-06-18 19:33:25,055 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:34849
2018-06-18 19:33:26,147 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:33:26,281 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:33:26,357 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000015/tmp/chkp3631000194723315261 as the basepath for checkpointing.
2018-06-18 19:33:26,363 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:33:26,413 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:34849/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:33:26,486 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@5abcc17a for node 2
2018-06-18 19:33:26,490 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:26,490 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744121_3297
2018-06-18 19:33:26,494 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:33:28,302 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:33:28,303 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:33:28,305 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@68788e96identifier=tcp://laptop-name:34849/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@4708f54c{da=com.datatorrent.bufferserver.internal.DataList$Block@4dfeb5cd{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=278, starting_window=5b28089000000001, ending_window=5b28089000000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@37bf072f[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000131 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19005
Log Contents:
2018-06-18 19:51:04,191 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:51:05,480 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000131/tmp as the basepath for spooling. 
2018-06-18 19:51:05,484 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:38383 2018-06-18 19:51:06,589 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:51:06,707 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:51:06,777 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:51:07,030 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:38383/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:51:08,726 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:51:08,729 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:51:08,733 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@6757e52fidentifier=tcp://laptop-name:38383/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@580027d0{da=com.datatorrent.bufferserver.internal.DataList$Block@7de5f17f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@af4db39[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000098 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20915 Log Contents: 2018-06-18 19:46:00,533 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:46:01,744 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000098/tmp as the basepath for spooling. 
2018-06-18 19:46:01,749 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33849 2018-06-18 19:46:02,838 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:46:02,930 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:46:03,004 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000098/tmp/chkp2996010530901334706 as the basepath for checkpointing. 2018-06-18 19:46:03,016 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:46:03,044 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33849/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:46:03,145 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@69bebfcd for node 2 2018-06-18 19:46:03,268 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:46:03,268 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:46:04,961 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:46:04,964 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:46:04,970 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@31dad35eidentifier=tcp://laptop-name:33849/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@12c02690{da=com.datatorrent.bufferserver.internal.DataList$Block@6ffac94f{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1574, starting_window=5b28089000000001, ending_window=5b2808900000000f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@a0d7035[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000073 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20913 Log Contents: 2018-06-18 19:42:12,652 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:42:13,810 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000073/tmp as the basepath for spooling. 
2018-06-18 19:42:13,814 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46099
2018-06-18 19:42:14,864 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:42:14,953 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:42:14,984 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46099/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:42:15,040 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000073/tmp/chkp3935824285106309602 as the basepath for checkpointing.
2018-06-18 19:42:15,066 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:42:15,250 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@23fe52df for node 2
2018-06-18 19:42:15,270 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:42:15,271 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:42:16,973 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:42:16,975 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:42:16,980 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@1a1b9ea{identifier=tcp://laptop-name:46099/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@21de570f{da=com.datatorrent.bufferserver.internal.DataList$Block@3d6433d3{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=5534, starting_window=5b28089000000001, ending_window=5b2808900000001f, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7d1eea[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000040 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:37:11,841 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:37:13,054 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000040/tmp as the basepath for spooling. 
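The aptest.FailureGenerator$ExpectedException traces in these container logs all follow the same path: the engine delivers a tuple through DefaultInputPort.put(), the port's process() callback throws, and the unchecked exception escapes to the engine, which logs "stopped running due to an exception" and undeploys the operator. A minimal self-contained sketch of that propagation (the InputPort and ExpectedException classes here are simplified stand-ins, not the real com.datatorrent.api or aptest types):

```java
public class PortFailureDemo {

    // Simplified stand-in for com.datatorrent.api.DefaultInputPort.
    abstract static class InputPort<T> {
        // put() is what the engine calls; it delegates to the user's process().
        public final void put(T tuple) {
            process(tuple); // an unchecked exception thrown here propagates to the engine
        }
        public abstract void process(T tuple);
    }

    // Stand-in for aptest.FailureGenerator$ExpectedException.
    static class ExpectedException extends RuntimeException {}

    // Returns the outcome the "engine" would observe for the failing port.
    static String run() {
        InputPort<Integer> input = new InputPort<Integer>() {
            @Override
            public void process(Integer tuple) {
                // analogue of FailureGenerator.failOrNot() throwing on a chosen tuple
                throw new ExpectedException();
            }
        };
        try {
            input.put(42);
            return "no failure";
        } catch (ExpectedException e) {
            // the real engine logs the trace, shuts down the node's executor
            // service, and sends an undeploy request, as seen in this log
            return "operator stopped: " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```

This mirrors the frame order in the traces above: the user frames (failOrNot, process) sit below put(), which sits below the engine's sweep/run loop.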
2018-06-18 19:37:13,058 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:43405 2018-06-18 19:37:14,130 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:37:14,218 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:37:14,292 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000040/tmp/chkp6122270518390231993 as the basepath for checkpointing. 2018-06-18 19:37:14,295 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:37:14,382 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:43405/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:37:14,416 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@1e3c9f96 for node 2 2018-06-18 19:37:14,427 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:37:14,428 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744598_3774 2018-06-18 19:37:14,430 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:37:14,431 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 
13 more 2018-06-18 19:37:16,242 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:37:16,252 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@33bfaf17identifier=tcp://laptop-name:43405/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7c69cde5{da=com.datatorrent.bufferserver.internal.DataList$Block@db706f6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@19aa4786[identifier=2.out.1] 2018-06-18 19:37:16,255 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000007 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20914 Log Contents: 2018-06-18 19:32:10,667 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:32:11,859 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000007/tmp as the basepath for spooling. 
2018-06-18 19:32:11,863 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46723 2018-06-18 19:32:12,962 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:32:13,040 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:32:13,110 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000007/tmp/chkp2651819136991822222 as the basepath for checkpointing. 2018-06-18 19:32:13,124 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:32:13,244 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@7dc37a6d for node 2 2018-06-18 19:32:13,345 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46723/2.out.1, windowId=5b2808900000000d, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:32:13,378 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:32:13,379 WARN org.apache.hadoop.hdfs.DFSClient: 
Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:32:15,078 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:32:15,081 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:32:15,086 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2e1eedf6identifier=tcp://laptop-name:46723/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3779448{da=com.datatorrent.bufferserver.internal.DataList$Block@e0964f4{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=2348, starting_window=5b28089000000001, ending_window=5b28089000000013, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@25f4001f[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000123 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:49:51,212 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:49:52,420 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000123/tmp as the basepath for spooling.
2018-06-18 19:49:52,427 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:45675
2018-06-18 19:49:53,520 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:49:53,622 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:49:53,687 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:45675/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:49:53,710 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000123/tmp/chkp4238117693415048007 as the basepath for checkpointing.
2018-06-18 19:49:53,719 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:49:53,837 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@55cc70d5 for node 2
2018-06-18 19:49:53,846 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:49:53,847 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073746191_5367
2018-06-18 19:49:53,849 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:49:53,850 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:49:55,649 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:49:55,651 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
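The ExpectedException at the top of this trace is thrown deliberately by the test application's aptest.FailureGenerator, invoked from the Aggregator operator's input port, so that the container fails and the platform's recovery path (undeploy, redeploy from checkpoint) can be observed in logs like this one. The actual aptest source is not part of this log; the sketch below is a hypothetical reconstruction, assuming only the class and method names visible in the stack trace and a simple fail-on-the-Nth-tuple trigger:

```java
// Hypothetical sketch of a failure injector matching the trace above
// (aptest.FailureGenerator.failOrNot). The trigger condition (Nth call)
// is an assumption; only the names come from the stack trace.
public class FailureGenerator {
  // Unchecked so it can propagate out of an operator's process() callback.
  public static class ExpectedException extends RuntimeException {
    public ExpectedException(String msg) { super(msg); }
  }

  private int tupleCount = 0;
  private final int failAt; // throw on the failAt-th call

  public FailureGenerator(int failAt) { this.failAt = failAt; }

  // Called once per tuple; throws deliberately to exercise container recovery.
  public void failOrNot() {
    if (++tupleCount == failAt) {
      throw new ExpectedException("deliberate failure at tuple " + tupleCount);
    }
  }

  public static void main(String[] args) {
    FailureGenerator gen = new FailureGenerator(3);
    int processed = 0;
    try {
      for (int i = 0; i < 10; i++) { gen.failOrNot(); processed++; }
    } catch (ExpectedException e) {
      System.out.println("failed after " + processed + " tuples: " + e.getMessage());
    }
  }
}
```

Because the exception escapes process(), the streaming engine logs the operator as "stopped running due to an exception" and issues the undeploy request seen above.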
2018-06-18 19:49:55,653 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@e65807f{identifier=tcp://laptop-name:45675/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@75df00a{da=com.datatorrent.bufferserver.internal.DataList$Block@704ece63{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=688, starting_window=5b28089000000001, ending_window=5b28089000000009, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3ac4a368[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000090 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23947
Log Contents:
2018-06-18 19:44:47,602 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:44:48,792 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000090/tmp as the basepath for spooling. 
2018-06-18 19:44:48,797 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44399 2018-06-18 19:44:49,870 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:44:49,962 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:44:50,036 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000090/tmp/chkp3956046276281456170 as the basepath for checkpointing. 2018-06-18 19:44:50,047 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:44:50,163 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@77c85716 for node 2 2018-06-18 19:44:50,164 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:44:50,165 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745553_4729 2018-06-18 19:44:50,167 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:44:50,167 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:44:50,187 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44399/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:44:51,995 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:44:51,997 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:44:52,001 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@789e365bidentifier=tcp://laptop-name:44399/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7456fb9a{da=com.datatorrent.bufferserver.internal.DataList$Block@4b2329ae{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1090, starting_window=5b28089000000001, ending_window=5b2808900000000c, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@312325ad[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000065 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:20914 Log Contents: 2018-06-18 19:40:59,725 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:41:00,892 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000065/tmp as the basepath for spooling. 
2018-06-18 19:41:00,895 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:35635 2018-06-18 19:41:01,960 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:41:02,037 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:41:02,093 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:35635/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:41:02,115 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000065/tmp/chkp525029383686648144 as the basepath for checkpointing. 2018-06-18 19:41:02,125 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:41:02,243 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4b102881 for node 2
2018-06-18 19:41:02,356 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:41:02,356 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:41:04,066 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:41:04,068 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:41:04,073 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@69bdc29fidentifier=tcp://laptop-name:35635/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3cd626f3{da=com.datatorrent.bufferserver.internal.DataList$Block@66bb5f4b{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@16a58456[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000032 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19004
Log Contents:
2018-06-18 19:35:58,920 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:36:00,092 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000032/tmp as the basepath for spooling. 
2018-06-18 19:36:00,096 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44103
2018-06-18 19:36:01,185 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:36:01,278 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:36:01,367 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
    at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
    at aptest.Aggregator$1.process(Aggregator.java:27)
    at aptest.Aggregator$1.process(Aggregator.java:24)
    at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
    at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
    at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
    at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:36:01,441 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44103/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:36:03,308 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:36:03,311 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:36:03,316 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@53dc9886identifier=tcp://laptop-name:44103/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6484767a{da=com.datatorrent.bufferserver.internal.DataList$Block@7485a6{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@34966eab[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000115 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20913
Log Contents:
2018-06-18 19:48:37,395 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:48:38,550 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000115/tmp as the basepath for spooling. 
2018-06-18 19:48:38,554 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:46411 2018-06-18 19:48:39,647 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:48:39,777 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:48:39,799 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:46411/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:48:39,864 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000115/tmp/chkp3637364073054611404 as the basepath for checkpointing. 2018-06-18 19:48:39,869 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:48:39,994 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@79ef330d for node 2 2018-06-18 19:48:40,109 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508) 2018-06-18 19:48:40,109 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573) 2018-06-18 19:48:41,801 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:48:41,803 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 2018-06-18 19:48:41,807 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@355a1ffidentifier=tcp://laptop-name:46411/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@879cdc5{da=com.datatorrent.bufferserver.internal.DataList$Block@39e8b867{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@5bb7e28e[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000082 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:24018 Log Contents: 2018-06-18 19:43:34,693 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:43:35,877 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000082/tmp as the basepath for spooling. 
2018-06-18 19:43:35,881 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36325 2018-06-18 19:43:36,948 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:43:37,019 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:43:37,093 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000082/tmp/chkp2486546742207904778 as the basepath for checkpointing. 2018-06-18 19:43:37,103 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:43:37,218 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@24cfbfa9 for node 2 2018-06-18 19:43:37,219 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream com.google.protobuf.InvalidProtocolBufferException at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto.(DataTransferProtos.java:20288) at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto.(DataTransferProtos.java:20182) at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto$1.parsePartialFrom(DataTransferProtos.java:20312) at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto$1.parsePartialFrom(DataTransferProtos.java:20307) at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200) at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217) at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223) at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49) at org.apache.hadoop.hdfs.protocol.proto.DataTransferProtos$BlockOpResponseProto.parseFrom(DataTransferProtos.java:20721) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:43:37,220 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745400_4576 2018-06-18 19:43:37,221 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:43:37,222 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:43:37,364 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36325/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:43:39,037 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:43:39,039 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:43:39,044 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@446242bf{identifier=tcp://laptop-name:36325/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1f4fdfef{da=com.datatorrent.bufferserver.internal.DataList$Block@7a2fdc54{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=1242, starting_window=5b28089000000001, ending_window=5b2808900000000d, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@2a136fdf[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000057 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:22165
Log Contents:
2018-06-18 19:39:46,767 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:39:47,930 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000057/tmp as the basepath for spooling. 
2018-06-18 19:39:47,934 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:42947
2018-06-18 19:39:49,031 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:39:49,152 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:39:49,175 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:42947/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:39:49,231 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000057/tmp/chkp326776391522928211 as the basepath for checkpointing.
2018-06-18 19:39:49,235 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:39:49,358 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@61cca5a5 for node 2
2018-06-18 19:39:49,362 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:49,363 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744925_4101
2018-06-18 19:39:49,366 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.InterruptedIOException: Call interrupted
	at org.apache.hadoop.ipc.Client.call(Client.java:1469)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:39:51,176 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:39:51,178 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:39:51,183 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@35e2d78f{identifier=tcp://laptop-name:42947/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@261c7303{da=com.datatorrent.bufferserver.internal.DataList$Block@44f6a4a0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@5eb9b117[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000024 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:20915
Log Contents:
2018-06-18 19:34:45,861 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:34:47,063 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000024/tmp as the basepath for spooling.
2018-06-18 19:34:47,066 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:36075
2018-06-18 19:34:48,118 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:34:48,217 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:34:48,292 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000024/tmp/chkp1932857748495436019 as the basepath for checkpointing.
2018-06-18 19:34:48,302 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:34:48,421 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@4ba054d0 for node 2
2018-06-18 19:34:48,560 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:36075/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:34:48,585 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:508)
2018-06-18 19:34:48,585 WARN org.apache.hadoop.hdfs.DFSClient: Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
2018-06-18 19:34:50,250 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:34:50,255 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:34:50,259 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@114f42dcidentifier=tcp://laptop-name:36075/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@37d247a1{da=com.datatorrent.bufferserver.internal.DataList$Block@3191b276{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=688, starting_window=5b28089000000001, ending_window=5b28089000000009, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@36944113[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000140 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:19005
Log Contents:
2018-06-18 19:52:26,449 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:52:27,921 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000140/tmp as the basepath for spooling.
2018-06-18 19:52:27,925 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41325
2018-06-18 19:52:29,023 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:52:29,134 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:52:29,267 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:52:29,403 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41325/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:52:31,161 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:52:31,164 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:52:31,168 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@f7119c6identifier=tcp://laptop-name:41325/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@18cd61dc{da=com.datatorrent.bufferserver.internal.DataList$Block@3465b367{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=12, starting_window=5b28089000000001, ending_window=5b28089000000001, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@163f13f7[identifier=2.out.1]
End of LogType:apex.log
LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr
LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout
Container: container_1529349239295_0005_01_000107 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:47:22,788 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath:
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:.
2018-06-18 19:47:24,115 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000107/tmp as the basepath for spooling.
2018-06-18 19:47:24,119 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:41371
2018-06-18 19:47:25,189 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]]
2018-06-18 19:47:25,406 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:41371/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024}
2018-06-18 19:47:25,413 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff}
2018-06-18 19:47:25,544 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000107/tmp/chkp6215683204979041972 as the basepath for checkpointing.
2018-06-18 19:47:25,549 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception.
aptest.FailureGenerator$ExpectedException
	at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17)
	at aptest.Aggregator$1.process(Aggregator.java:27)
	at aptest.Aggregator$1.process(Aggregator.java:24)
	at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81)
	at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288)
	at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269)
	at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429)
2018-06-18 19:47:25,674 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@79d893f4 for node 2
2018-06-18 19:47:25,702 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream
java.nio.channels.ClosedByInterruptException
	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
	at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:25,703 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745879_5055
2018-06-18 19:47:25,705 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server
java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
2018-06-18 19:47:25,706 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
java.io.IOException: java.lang.InterruptedException
	at org.apache.hadoop.ipc.Client.call(Client.java:1460)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404)
	at java.util.concurrent.FutureTask.get(FutureTask.java:191)
	at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059)
	at org.apache.hadoop.ipc.Client.call(Client.java:1454)
	... 13 more
2018-06-18 19:47:27,442 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2]
2018-06-18 19:47:27,443 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete.
2018-06-18 19:47:27,446 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@b541268{identifier=tcp://laptop-name:41371/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@25463024{da=com.datatorrent.bufferserver.internal.DataList$Block@5e807829{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@781adff9[identifier=2.out.1]
End of LogType:apex.log

LogType:stderr
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:0
Log Contents:
End of LogType:stdout

Container: container_1529349239295_0005_01_000074 on localhost_40317
======================================================================
LogType:apex.log
Log Upload Time:Mon Jun 18 19:53:14 +0000 2018
LogLength:23945
Log Contents:
2018-06-18 19:42:21,717 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:42:22,911 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000074/tmp as the basepath for spooling. 
2018-06-18 19:42:22,915 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:44077 2018-06-18 19:42:24,012 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:42:24,043 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:44077/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:42:24,123 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:42:24,184 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000074/tmp/chkp5812532368567757459 as the basepath for checkpointing. 2018-06-18 19:42:24,197 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:42:24,312 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@24bc158a for node 2 2018-06-18 19:42:24,318 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:42:24,319 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073745248_4424 2018-06-18 19:42:24,321 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:42:24,322 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:42:26,153 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:42:26,157 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:42:26,161 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@2ce3cdacidentifier=tcp://laptop-name:44077/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2002d950{da=com.datatorrent.bufferserver.internal.DataList$Block@3c3502c{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=947, starting_window=5b28089000000001, ending_window=5b2808900000000b, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4b716a27[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000049 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23944 Log Contents: 2018-06-18 19:38:33,875 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:38:35,036 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000049/tmp as the basepath for spooling. 
2018-06-18 19:38:35,040 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:33053 2018-06-18 19:38:36,133 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:38:36,253 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:38:36,319 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:33053/2.out.1, windowId=5b28089000000023, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:38:36,326 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000049/tmp/chkp495260638084328662 as the basepath for checkpointing. 2018-06-18 19:38:36,333 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:38:36,458 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@50d8d21c for node 2 2018-06-18 19:38:36,469 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:36,470 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744771_3947 2018-06-18 19:38:36,472 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:38:36,473 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at 
com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:38:38,276 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:38:38,278 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:38:38,283 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@44101b68identifier=tcp://laptop-name:33053/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@341e1af2{da=com.datatorrent.bufferserver.internal.DataList$Block@671abd0{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=127, starting_window=5b28089000000001, ending_window=5b28089000000003, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4d245774[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout Container: container_1529349239295_0005_01_000016 on localhost_40317 ====================================================================== LogType:apex.log Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:23946 Log Contents: 2018-06-18 19:33:32,945 INFO com.datatorrent.stram.engine.StreamingContainer: Child starting with classpath: 
./jetty-servlet-8.1.10.v20130312.jar:./aws-java-sdk-core-1.10.73.jar:./jetty-http-8.1.10.v20130312.jar:./kryo-2.24.0.jar:./commons-beanutils-1.9.2.jar:./minlog-1.2.jar:./commons-collections-3.2.1.jar:./jersey-core-1.9.jar:./bval-core-0.5.jar:./validation-api-1.1.0.Final.jar:./jetty-server-8.1.10.v20130312.jar:./apex-bufferserver-3.7.0.jar:./xbean-asm5-shaded-4.3.jar:./myapexapp-1.0-SNAPSHOT.jar:./jackson-annotations-2.5.0.jar:./geronimo-jms_1.1_spec-1.1.1.jar:./jctools-core-1.1.jar:./jetty-io-8.1.10.v20130312.jar:./jackson-mapper-asl-1.9.13.jar:./httpclient-4.3.6.jar:./jetty-security-8.1.10.v20130312.jar:./jooq-3.6.4.jar:./apex-shaded-ning19-1.0.0.jar:./netlet-1.3.2.jar:./jersey-client-1.9.jar:./jackson-core-2.5.4.jar:./hawtbuf-1.9.jar:./apex-engine-3.7.0.jar:./activemq-client-5.8.0.jar:./aws-java-sdk-kms-1.10.73.jar:./joda-time-2.9.1.jar:./jms-api-1.1-rev-1.jar:./jackson-dataformat-cbor-2.5.3.jar:./malhar-library-3.8.0.jar:./mbassador-1.1.9.jar:./apex-common-3.7.0.jar:./bval-jsr303-0.5.jar:./httpcore-4.3.3.jar:./named-regexp-0.2.3.jar:./jetty-continuation-8.1.10.v20130312.jar:./fastutil-7.0.6.jar:./jackson-databind-2.5.4.jar:./commons-compiler-2.7.8.jar:./jetty-websocket-8.1.10.v20130312.jar:./aws-java-sdk-s3-1.10.73.jar:./javax.mail-1.5.0.jar:./geronimo-j2ee-management_1.1_spec-1.0.1.jar:./activation-1.1.jar:./apex-api-3.7.0.jar:./jackson-core-asl-1.9.13.jar:./commons-lang3-3.1.jar:./slf4j-api-1.7.5.jar:./commons-logging-1.1.1.jar:./jetty-util-8.1.10.v20130312.jar:/etc/hadoop/conf:/usr/lib/hadoop/hadoop-annotations-2.7.3.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-nfs-2.7.3.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.7.3-tests.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-common-2.7.3.jar:/usr/lib/hadoop/hadoop-auth-2.7.3.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/jsr3
05-3.0.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/mockito-all-1.8.
5.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-tests.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/
lib/hadoop-mapreduce/hadoop-aws.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.3.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.3.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/joda-time-2.9.9.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/aws-java-sdk-1.7.4.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-
M15.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jackson-dat
abind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.3.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.3.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/
hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/lib/guice-serv
let-3.0.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:. 2018-06-18 19:33:34,106 INFO com.datatorrent.bufferserver.storage.DiskStorage: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000016/tmp as the basepath for spooling. 
2018-06-18 19:33:34,110 INFO com.datatorrent.bufferserver.server.Server: Server started listening at /0:0:0:0:0:0:0:0:40703 2018-06-18 19:33:35,195 INFO com.datatorrent.stram.engine.StreamingContainer: Deploy request: [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] 2018-06-18 19:33:35,318 INFO com.datatorrent.bufferserver.server.Server: Received publisher request: PublishRequestTuple{version=1.0, identifier=2.out.1, windowId=ffffffffffffffff} 2018-06-18 19:33:35,388 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/apex/appcache/application_1529349239295_0005/container_1529349239295_0005_01_000016/tmp/chkp3958676527904179191 as the basepath for checkpointing. 2018-06-18 19:33:35,395 ERROR com.datatorrent.stram.engine.StreamingContainer: Operator set [OperatorDeployInfo[id=2,name=Aggregator,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=aggregate,sourceNodeId=1,sourcePortName=out,locality=,partitionMask=0,partitionKeys=]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=writer,bufferServer=laptop-name]]]] stopped running due to an exception. 
aptest.FailureGenerator$ExpectedException at aptest.FailureGenerator.failOrNot(FailureGenerator.java:17) at aptest.Aggregator$1.process(Aggregator.java:27) at aptest.Aggregator$1.process(Aggregator.java:24) at com.datatorrent.api.DefaultInputPort.put(DefaultInputPort.java:81) at com.datatorrent.stram.stream.BufferServerSubscriber$BufferReservoir.sweep(BufferServerSubscriber.java:288) at com.datatorrent.stram.engine.GenericNode.run(GenericNode.java:269) at com.datatorrent.stram.engine.StreamingContainer$2.run(StreamingContainer.java:1429) 2018-06-18 19:33:35,487 INFO com.datatorrent.bufferserver.server.Server: Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://laptop-name:40703/2.out.1, windowId=5b28089000000013, type=writer/3.input, upstreamIdentifier=2.out.1, mask=0, partitions=null, bufferSize=1024} 2018-06-18 19:33:35,515 WARN com.datatorrent.stram.engine.Node: Shutting down executor service java.util.concurrent.Executors$FinalizableDelegatedExecutorService@705fb39d for node 2 2018-06-18 19:33:35,539 INFO org.apache.hadoop.hdfs.DFSClient: Exception in createBlockOutputStream java.nio.channels.ClosedByInterruptException at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478) at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63) at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159) at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117) at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) at java.io.DataOutputStream.flush(DataOutputStream.java:123) at org.apache.hadoop.hdfs.protocol.datatransfer.Sender.send(Sender.java:82) at 
org.apache.hadoop.hdfs.protocol.datatransfer.Sender.writeBlock(Sender.java:161) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:33:35,540 INFO org.apache.hadoop.hdfs.DFSClient: Abandoning BP-2135041833-172.17.0.3-1526202085113:blk_1073744140_3316 2018-06-18 19:33:35,542 WARN org.apache.hadoop.ipc.Client: interrupted waiting to send rpc request to server java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) 2018-06-18 19:33:35,543 WARN 
org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception java.io.IOException: java.lang.InterruptedException at org.apache.hadoop.ipc.Client.call(Client.java:1460) at org.apache.hadoop.ipc.Client.call(Client.java:1412) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) at com.sun.proxy.$Proxy10.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.abandonBlock(ClientNamenodeProtocolTranslatorPB.java:395) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) at com.sun.proxy.$Proxy11.abandonBlock(Unknown Source) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) Caused by: java.lang.InterruptedException at java.util.concurrent.FutureTask.awaitDone(FutureTask.java:404) at java.util.concurrent.FutureTask.get(FutureTask.java:191) at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(Client.java:1059) at org.apache.hadoop.ipc.Client.call(Client.java:1454) ... 13 more 2018-06-18 19:33:37,346 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy request: [2] 2018-06-18 19:33:37,353 INFO com.datatorrent.stram.engine.StreamingContainer: Undeploy complete. 
2018-06-18 19:33:37,356 INFO com.datatorrent.bufferserver.server.Server: Removing ln LogicalNode@313bdd42identifier=tcp://laptop-name:40703/2.out.1, upstream=2.out.1, group=writer/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5d42c094{da=com.datatorrent.bufferserver.internal.DataList$Block@7e9efbcd{identifier=2.out.1, data=67108864, readingOffset=0, writingOffset=465, starting_window=5b28089000000001, ending_window=5b28089000000007, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@657d8291[identifier=2.out.1] End of LogType:apex.log LogType:stderr Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stderr LogType:stdout Log Upload Time:Mon Jun 18 19:53:14 +0000 2018 LogLength:0 Log Contents: End of LogType:stdout