-
Compiling with bigtop
-
Resource Description:
- Software and code mirroring
- Package mirrors
- github access
-
Compilation-related knowledge
- technical knowledge
- bigtop compilation process and experience summary
- Difficulty and approximate compilation time for each module (purely compilation time, not including file downloads and troubleshooting time)
-
Resource Description:
-
centos real machine build branch-3.2
- Hardware Description:
-
Compilation steps
- Download the code and switch branches
- Domestic Mirror Configuration
-
Preparation of the basic environment
- Dependent Environment Installation (yum)
- Dependent Environment Configuration
- Domestic Mirror Configuration | Software Global Configuration
-
Modify the source code of some components
- Download component source code
-
Modify the code
- Modifying hadoop code
- Modify flink code
- Modify tez code
- Modify zeppelin code
- Modifying spark source code
- Repackaging component source code
-
Overall compilation [not recommended]
- Full compilation command (executed in the bigtop root directory)
-
Step-by-step component-by-component compilation
- Compile some peripheral dependent components first:
- zookeeper (21 subprojects)
- hadoop (111 subprojects)
- hbase
- hive
- phoenix
- tez
- spark (29 subprojects)
- kafka
- flink (207 subprojects)
- solr
- zeppelin
- Compiled Disk Occupancy
-
branch-3.2 Compile Time for Each Component [Pure compile time, excluding the time to download related packages].
- hbase
- Phoenix
- tez
- spark
- flink
- docker-centos Compile
-
bigtop source code analysis
Compiling with bigtop
Resource Description:
Software and code mirroring
-
apache code and software releases resources download
- History packages (almost all versions):/dist/
- History packages (recent versions):/
- Domestic history packages (almost all versions):/apache/
- Domestic history package (most recent version):/apache/
- Tencent Cloud:/
- Tsinghua University Mirror Station:/
-
Domestic mirrors of development languages and related software installation packages
- gradle
- /macports/distfiles/ -> /macports/distfiles/gradle/
- /gradle/ -> /gradle/
- nodejs:
- /dist/
- /nodejs-release/
- /nodejs-release/
- /nodejs-release/
- npm
- Official:/npm/-/npm-6.9.
- Aliyun:/macports/distfiles/
Package mirrors
-
jar packages [maven mirrors]:
- node packages [npm, yarn mirrors]
- Domestic mirrors:
- Use:
- npm config set registry
- yarn config set registry
- bower config set registry
- bower:
- Usage: 1) download the package directly from the npm mirror and install it, or 2) git clone the github release branch and install from it
- Background reading:/p/bower?hmsr=aladdin1e1
- Foreign address:
- Domestic access to mirror addresses:
- bower built-in commands:
- View git branches: git ls-remote --tags --heads /components/
github access
Modify the hosts file to access github: use an IP-lookup site to find the current IP for each github-related domain name, then add the domain and IP pairs to the hosts file
140.82.112.4
199.232.69.194
185.199.108.133
185.199.109.133
185.199.110.133
185.199.111.133
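The hostnames belonging to these IPs were lost in these notes; a minimal sketch of the hosts edit, assuming the usual github endpoints for these IP ranges (verify the current IPs yourself, since they change over time):

```shell
# Hypothetical sketch: append ip/hostname pairs to a hosts file. The hostnames
# below are assumptions (the typical github endpoints for these IPs).
HOSTS_FILE="${HOSTS_FILE:-$(mktemp)}"   # set HOSTS_FILE=/etc/hosts for real use

add_host() {
    # append "ip hostname" only if that exact pair is not already present
    grep -qF "$1 $2" "$HOSTS_FILE" 2>/dev/null || echo "$1 $2" >> "$HOSTS_FILE"
}

add_host 140.82.112.4    github.com
add_host 185.199.108.133 raw.githubusercontent.com
add_host 185.199.109.133 raw.githubusercontent.com

cat "$HOSTS_FILE"
```

Pointing HOSTS_FILE at a scratch file first lets you check the result before touching /etc/hosts.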
Compilation-related knowledge
technical knowledge
-
What you need to know about maven command parameters:
- Various skips
- -Drat.skip=true : the RAT plugin checks that all source files carry proper license declarations; this parameter skips the license check
- -Dmaven.test.skip=true : skips both the test-compilation and test-execution phases, i.e. it not only skips running the tests but also skips compiling the test code.
- -DskipTests : maven skips test execution but still compiles the test code.
- Some components (e.g. most flink sub-projects) need their test code compiled, while others do not, so choose the skip flag accordingly.
- Make the maven log print timestamps:
-Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss"
- If maven compilation fails, fix the cause of the failure (such as a network failure), then continue compiling from the failed module:
mvn <args> -rf :xxxx
where xxxx is the name of the module that failed last time.
bigtop compilation process and experience summary
-
Directory layout
- : defines the version of each component
- : defines the overall process of compiling components with gradle
- Component configuration directory (packages for short): bigtop\bigtop-packages\src\common\xxx\, where xxx is the component name
- do-component-build: Generally the maven command is wrapped as a gradle command here
- Component source package download directory (dl for short): bigtop\dl\xxx, where xxx is the component name
- Component build directory (build for short): bigtop/build/xxx where xxx component name
- rpm/SOURCES: The files in the component configuration directory are copied here, and the source packages in the component source download directory are also copied here.
- rpm/BUILD: the component source is extracted and compiled here (note: this directory is regenerated on every build, so editing files here before compilation has no effect)
- rpm/RPMS: where the rpm packages are stored after the component is packaged; if files already exist here, gradle skips the compilation
- rpm/SRPMS: srpm package storage location
- tar: store the source tarball
-
Component download and build process (detailed reading recommended)
- File flow: dl/spark-xxx [|zip|tgz] -> build/spark/rpm/SOURCES/spark/[&sourcecode] -> build/spark/rpm/BUILD/spark/spark-xxx[sourcecode] -> build/spark/rpm/RPMS/[] -> output/spark/ []
- Download the component source package into the dl directory; if the package already exists it is not re-downloaded (use this rule to download beforehand and save time, or to modify the source and repackage it)
- Extract the source archive from dl into the build directory and apply the configuration from the packages directory to it (this step is repeated on every build until compilation succeeds, so edits made to the extracted code are overwritten)
- Compile in the build directory; if a configuration change does not take effect, delete the build directory and re-run the xxx-rpm gradle task (e.g. when the npm, yarn, or bower mirror configuration is not picked up)
-
Compilation experience
- Review the packages directory and adjust its configuration before compiling, in particular switching to domestic mirrors.
- Download the source packages in advance and place them in dl/; for the archives whose code must be changed, modify and repackage them.
- If a front-end project such as tez-ui fails to build, cd into its directory and run the node package install commands (npm install, yarn install, bower install) by hand.
- Note: do not use the globally installed npm/yarn/bower; locate the project-local executables and invoke them by full path.
- If maven compilation fails, fix the cause of the failure and prefer `-rf :xxx` to resume from that module onward; once maven succeeds end-to-end, compile the whole component again with gradle.
- The exact maven commands can be found in the gradle compilation logs.
- Keep a log of gradle builds, so that on failure you can find the cause, the maven and front-end commands, and the locations of the local npm, yarn, and bower.
-
parallel compilation: bigtop 3.2.0 doesn't seem to support parallel compilation, 3.3.0 does.
--parallel --parallel-threads=N
-
Keep logs, for easy access to mvn commands and problem troubleshooting:
- gradle compilation:
./gradlew tez-rpm -PparentDir=/usr/bigtop >> 2>>
- maven compile (please refer to the mvn command in the gradle log):
mvn package install -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss" >> /soft/code/bigtop/ 2>> /soft/code/bigtop/
- Front-end compilation:
npm install
yarn install
bower install --allow-root
Difficulty and approximate compilation time for each module (purely compilation time, not including file downloads and troubleshooting time)
- Format: component name | difficulty (out of 5 stars) | pure compile time | result
- hadoop *** (1 hour) [success, no logs]
- zookeeper * (30 minutes) [success, no logs]
- hive * (1 hour) [success]
- hbase * (30 minutes) [success]
- phoenix * (30 minutes) [success]
- tez *** (1 hour) [Success]
- bigtop-ambari-mpack * (5 minutes) [success]
- bigtop-groovy * (5 seconds) [success]
- bigtop-jsvc * (5 seconds) [success]
- bigtop-select * (5 seconds) [success]
- bigtop-utils * (5 seconds) [success]
- ranger (none)
- solr *** (1 hour) [success]
- kafka *** (30 minutes) [success]
- spark ***** (150 minutes) [success, with the local R make removed; sparkR itself is still built]
- flink [failed]
- zeppelin []
centos real machine build branch-3.2
- Reference (it is highly recommended to read through the following articles first for understanding)
- BigTop 3.2.0 Big Data Component Compilation - Basic Environment Preparation:/m0_48319997/article/details/128037302
- BigTop 3.2.0 Big Data Component Compilation - Component Compilation:/m0_48319997/article/details/128101667
- Compilation guide for the ambari2.8.0+bigtop3.2.0 distribution of the Big Data Platform:/m0_48319997/article/details/130046296
- Ambari+Bigtop Big Data Platform Installation and Deployment Guide (Centos7) I:/m0_48319997/article/details/130069050
- Ambari+Bigtop Big Data Platform Installation and Deployment Guide (Centos7) II:/m0_48319997/article/details/130233526
Hardware Description:
- Lenovo W540 laptop: 32 GB RAM, 4-core/8-thread CPU, 1 TB SSD
- Build environment: VMware CentOS 7.9, 2 cores/4 threads (half the physical machine), 12 GB RAM, 500 GB disk (at least 100 GB recommended; see the compiled disk occupancy below)
- Note: keep the centos virtual machine files on the SSD partition for faster response times
Compilation steps
Downloading code and switching branches
git clone /piaolingzxh/
cd bigtop/
git checkout -b branch-3.2 origin/branch-3.2
- Description:
- Time of this build: 2024-07-18 to 2024-07-29
- The branch-3.2 branch is continuously receiving commits, so the code you get may be newer than mine; it may have fixed some of the problems below, but it may also have new ones.
- At the time of writing, the latest branch-3.2 commit is from 2024-06-02, git hash 3ffe75e05e8428f353a018aafc9c003be72ca6ff
- The branch-3.2 code actually contains the code of both the 3.2.0 and 3.2.1 releases, i.e. it is newer than either of those tags; some commits are even newer than the master branch.
Domestic Mirror Configuration
# Modify the bigtop config; there are two changes to make (adjust both to match your bigtop 3.x version)
#1. At lines 103-104, change the mirror sources to domestic mirrors
APACHE_MIRROR = "/apache"
APACHE_ARCHIVE = "/apache"
#2. Uncomment the bigtop-select component (lines 273 and 281)
Note: Some of the material may not be available on aliyun, so you need to change back to the original address.
APACHE_MIRROR = ""
APACHE_ARCHIVE = "/dist"
Preparation of the basic environment
Dependent Environment Installation (yum)
#Install the dependencies required for component compilation
#dependencies
yum -y install fuse-devel cmake cmake3 lzo-devel openssl-devel protobuf* cyrus-*
#change the default cmake to cmake3
mv /usr/bin/cmake /usr/bin/
ln -s /usr/bin/cmake3 /usr/bin/cmake
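The mv/ln above discards the old cmake binary; a guarded variant (a sketch, with BIN_DIR parameterized for illustration; on a real machine it would be /usr/bin) keeps a backup and checks that cmake3 exists first:

```shell
# Sketch: point "cmake" at cmake3 while keeping a backup of the old binary.
link_cmake3() {
    bindir="$1"
    [ -x "$bindir/cmake3" ] || { echo "cmake3 not found in $bindir" >&2; return 1; }
    # keep the original cmake (if any) as cmake2.bak instead of discarding it
    if [ -e "$bindir/cmake" ]; then mv "$bindir/cmake" "$bindir/cmake2.bak"; fi
    ln -s "$bindir/cmake3" "$bindir/cmake"
}

# demo against a scratch directory with stand-in binaries
demo=$(mktemp -d)
printf '#!/bin/sh\necho fake-cmake2\n' > "$demo/cmake";  chmod +x "$demo/cmake"
printf '#!/bin/sh\necho fake-cmake3\n' > "$demo/cmake3"; chmod +x "$demo/cmake3"
link_cmake3 "$demo"
"$demo/cmake"   # now runs the cmake3 stand-in
```

Run it as `link_cmake3 /usr/bin` (as root) on the real machine once the dry run looks right.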
#dependencies
yum -y install cppunit-devel
#dependencies
yum -y install R* harfbuzz-devel fribidi-devel libcurl-devel libxml2-devel freetype-devel libpng-devel libtiff-devel libjpeg-turbo-devel pandoc* libgit2-devel
# In my build these Rscript packages were not installed, and the spark compilation skips the local R make
Rscript -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'testthat', 'e1071', 'survival'), repos='/CRAN/')"
Rscript -e "install.packages(c('devtools'), repos='/CRAN/')"
Rscript -e "install.packages(c('evaluate'), repos='/CRAN/')"
# making R locally in this environment reports the following errors
package 'evaluate' is not available (for R version 3.6.0)
dependency 'evaluate' is not available for package 'knitr'
Rscript -e "install.packages(c('knitr'), repos='/CRAN/')"
Rscript -e "install.packages(c('evaluate'), repos='/CRAN/')"
Dependent Environment Configuration
- github-related hosts
140.82.112.4
199.232.69.194
185.199.108.133
185.199.109.133
185.199.110.133
185.199.111.133
Domestic Mirror Configuration | Software Global Configuration
- maven mirror configuration
- node mirror configuration: ~/.npmrc , npm config set registry
- yarn mirror configuration: ~/.yarnrc
- bower mirror configuration: ~/.bowerrc
# ~/.bowerrc
{
"directory": "bower_components",
"registry": "",
"analytics": false,
"resolvers": [
"bower-shrinkwrap-resolver-ext"
],
"strict-ssl": false
}
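The .bowerrc above can be written from a script; a sketch, where the registry URL is a placeholder (the real mirror URL is omitted in these notes) and BOWERRC defaults to a temp file so you can inspect the result before writing $HOME/.bowerrc:

```shell
# Sketch: generate the .bowerrc shown above. "https://registry.example.com"
# is a placeholder -- substitute the bower/npm mirror you actually use.
BOWERRC="${BOWERRC:-$(mktemp)}"   # normally $HOME/.bowerrc
cat > "$BOWERRC" <<'EOF'
{
  "directory": "bower_components",
  "registry": "https://registry.example.com",
  "analytics": false,
  "resolvers": ["bower-shrinkwrap-resolver-ext"],
  "strict-ssl": false
}
EOF
echo "wrote $BOWERRC"
```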
Modify the source code of some components
Download component source code
#1. Download first
./gradlew tez-download zeppelin-download flink-download spark-download
#2. Enter the download directory
cd dl
#3. Extract the tarballs
tar -zxvf flink-1.15.
tar -zxvf apache-tez-0.10.
tar -zxvf zeppelin-0.10.
tar -zxvf spark-3.2.
Modify the code
Modifying hadoop code
-- In dl/hadoop-3.3.6-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/, add an <execution> node between the yarn install and bower install steps, as follows
<execution>
<phase>generate-resources</phase>
<id>bower install moment-timezone</id>
<configuration>
<arguments>install moment-timezone=/moment/#=0.5.1 --allow-root</arguments>
</configuration>
<goals>
<goal>bower</goal>
</goals>
</execution>
Modify flink code
1) node and npm versions and related settings
vi flink-1.15.0/flink-runtime-web/
At line 275, change nodeVersion to v12.22.1
At line 276, change npmVersion to 6.14.12
bigtop/dl/flink-1.15.3/flink-runtime-web/
Delete the line
<arguments>ci --cache-max=0 --no-save ${}</arguments>
and replace it with the following line (note that ci is gone, replaced by install):
<arguments> install -g --registry= --cache-max=0 --no-save</arguments>
2) Disable the test code that reported errors
cd dl/flink-1.15.3/flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/
mv CachedSchemaCoderProviderTest.java CachedSchemaCoderProviderTest.java1
mv RegistryAvroFormatFactoryTest.java RegistryAvroFormatFactoryTest.java1
cd dl/flink-1.15.3/flink-end-to-end-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/
mv SQLClientSchemaRegistryITCase.java SQLClientSchemaRegistryITCase.java1
3) Some packages fail to download; download them manually and install them into the local repository
wget /maven/io/confluent/common-config/6.2.2/common-config-6.2. ./
wget /maven/io/confluent/common-utils/6.2.2/common-utils-6.2. ./
wget /maven/io/confluent/kafka-avro-serializer/6.2.2/kafka-avro-serializer-6.2. ./
wget /maven/io/confluent/kafka-schema-registry-client/6.2.2/kafka-schema-registry-client-6.2. ./
# install the jars into the local repository
mvn install:install-file -Dfile=/soft/ambari-develop/common-config-6.2. -DgroupId= -DartifactId=common-config -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/common-utils-6.2. -DgroupId= -DartifactId=common-utils -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/kafka-avro-serializer-6.2. -DgroupId= -DartifactId=kafka-avro-serializer -Dversion=6.2.2 -Dpackaging=jar
mvn install:install-file -Dfile=/soft/ambari-develop/kafka-schema-registry-client-6.2. -DgroupId= -DartifactId=kafka-schema-registry-client -Dversion=6.2.2 -Dpackaging=jar
npm install -g @angular/[email protected]
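The four install:install-file commands above can be generated in a loop; a sketch, where the groupId io.confluent is inferred from the confluent download paths and the full jar names are assumed to be `<artifact>-6.2.2.jar` (both assumptions, since the names are truncated in these notes). DRY_RUN=1 (the default) only prints the commands instead of invoking mvn:

```shell
# Sketch: build the mvn install:install-file commands for the four jars.
# groupId and jar-name pattern are assumptions; DRY_RUN=1 prints only.
DRY_RUN="${DRY_RUN:-1}"
VERSION=6.2.2
CMDS=""
for artifact in common-config common-utils kafka-avro-serializer kafka-schema-registry-client; do
    cmd="mvn install:install-file -Dfile=./${artifact}-${VERSION}.jar -DgroupId=io.confluent -DartifactId=${artifact} -Dversion=${VERSION} -Dpackaging=jar"
    CMDS="$CMDS$cmd
"
    [ "$DRY_RUN" = 1 ] || $cmd    # only runs mvn when DRY_RUN is unset/0
done
printf '%s' "$CMDS"
```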
Modify tez code
vi apache-tez-0.10.1-src/tez-ui/
At line 37, change allow-root-build to --allow-root=true
bower domestic mirror configuration: change the registry to the domestic mirror
- Files involved:
- bigtop\bigtop-packages\src\common\ambari\
- bigtop\bigtop-packages\src\common\tez\
Modify zeppelin code
vi zeppelin-0.10.1/
Change to true at line 209
vi zeppelin-0.10.1/spark/
Change to /apache/spark/${}/${}.tgz at line 50
On line 53 change to /apache/spark/${}/${}-
vi zeppelin-0.10.1/rlang/
Change to /apache/spark/${}/${}.tgz at line 41
Change to /apache/spark/${}/${}- at line 44
vi zeppelin-0.10.1/flink/flink-scala-parent/
Change to /apache/flink/flink-${}/flink-${}-bin-scala_${}.tgz on line 45
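The zeppelin edits above all replace a hard-coded apache download URL with a mirror base; the same edit can be scripted with sed. A sketch on a sample snippet: the property name and URL below are illustrative stand-ins (not taken from the real poms), MIRROR is a placeholder, and the line numbers in your zeppelin tree may differ:

```shell
# Sketch: swap a download-URL prefix for a mirror base in a pom-like file.
# Property name, URL, and MIRROR are all hypothetical examples.
MIRROR="https://mirrors.example.com/apache"
pom=$(mktemp)
cat > "$pom" <<'EOF'
<spark.download.url>https://archive.apache.org/dist/spark/spark-3.2.3/spark-3.2.3-bin-hadoop3.2.tgz</spark.download.url>
EOF
sed -i "s#https://archive.apache.org/dist#${MIRROR}#g" "$pom"
cat "$pom"
```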
Modifying spark source code
cd bigtop/dl
vim spark-3.2.3/dev/
#BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
BUILD_COMMAND=("$MVN" clean package -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss" $@) #add the timestamp-printing parameters on this line
tar zcvf spark-3.2. spark-3.2.3
Repackaging component source code
- Note: the repackaged archive must have exactly the same name as the archive before extraction (including the suffix)
tar zcvf flink-1.15. flink-1.15.3
tar zcvf apache-tez-0.10. apache-tez-0.10.1-src
tar zcvf zeppelin-0.10. zeppelin-0.10.1
tar zcvf spark-3.2. spark-3.2.3
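The same-name rule can be checked with a quick round trip; a sketch, where the archive name is illustrative (match whatever actually sits in your dl/ directory):

```shell
# Sketch: repack an edited source tree under its original archive name. If the
# name (including suffix) differs from what is in dl/, gradle treats the
# package as missing and downloads a fresh, unpatched copy.
work=$(mktemp -d)
mkdir -p "$work/zeppelin-0.10.1"
echo patched > "$work/zeppelin-0.10.1/MARKER"          # stand-in for an edited file
tar -zcf "$work/zeppelin-0.10.1.tar.gz" -C "$work" zeppelin-0.10.1
tar -ztf "$work/zeppelin-0.10.1.tar.gz"                # verify the contents
```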
Overall compilation [not recommended]
- The full build is expected to take many hours and may fail with a variety of errors along the way; it is recommended to compile one component at a time and only run the overall build after every component has compiled successfully.
- compilation parameter
- allclean
- -PparentDir=/usr/bigtop : the default installation location of the rpm packages, equivalent to ambari's default installation path /usr/hdp
- -Dbuildwithdeps=true: when compiling, if the dependencies are not compiled, compile the dependencies first
- -PpkgSuffix:
- The bigtop version at compile time, similar to the hdp version number, e.g. 3.2.0, 3.2.1, 3.3.0.
- This version number will be reflected in the name of the packaged file, so be sure to take this parameter with you when compiling, otherwise ambari will not be able to find the corresponding package after compilation.
- Example package name: hadoop_3_2_1-hdfs-namenode-3.3.6-1.el7.x86_64.rpm, where 3_2_1 represents the bigtop version number
- $component-rpm: component represents the component name, such as spark, hive, hbase and other big data components.
- allclean: deletes build/*, output/, /dist. Use with caution: recompiling is extremely time-consuming, and because of network restrictions errors may appear that did not occur before. It is recommended to back up these three directories manually before using this command.
In particular:
When compiling, be sure to add -PpkgSuffix, otherwise the packages will not carry the bigtop version number and ambari will not recognize them when installing the big data components later; the specific errors look like this:
Error getting repository data for BIGTOP-3.2.0, repository not found
No package found for hadoop_${stack_version}(expected name: hadoop_3_2_0)
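The suffix ambari looks for is embedded in the rpm file name between the component name and the first hyphen; plain shell parameter expansion is enough to pull it out of the example name above:

```shell
# Illustration: extract the bigtop stack suffix (3_2_1) from the rpm name.
rpm_name="hadoop_3_2_1-hdfs-namenode-3.3.6-1.el7.x86_64.rpm"
pkg="${rpm_name%%-*}"      # strip from the first hyphen -> hadoop_3_2_1
suffix="${pkg#*_}"         # strip the component name    -> 3_2_1
echo "$suffix"
```

If a package built without -PpkgSuffix, this first field is just `hadoop`, which is why ambari's name lookup fails.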
Full compilation command (executed in the bigtop root directory)
#
./gradlew bigtop-groovy-rpm bigtop-jsvc-rpm bigtop-select-rpm bigtop-utils-rpm flink-rpm hadoop-rpm hbase-rpm hive-rpm kafka-rpm solr-rpm spark-rpm tez-rpm zeppelin-rpm zookeeper-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
Step-by-step component-by-component compilation
Compile some peripheral dependent components first:
- bigtop-ambari-mpack-rpm
- bigtop-groovy-rpm
- bigtop-jsvc-rpm
- bigtop-select-rpm
- bigtop-utils-rpm
./gradlew bigtop-ambari-mpack-rpm bigtop-groovy-rpm bigtop-jsvc-rpm bigtop-select-rpm bigtop-utils-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
zookeeper (21 subprojects)
./gradlew zookeeper-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
hadoop (111 subprojects)
- gradle compile command:
./gradlew hadoop-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
- Example of maven compile command:
cd /soft/code/bigtop/build/hadoop/rpm/BUILD/hadoop-3.3.6-src
mvn -Pdist -Pnative -Psrc -Pyarn-ui -Dtar -=3.6.4 -=2.0 -DskipTests -DskipITs install -rf :hadoop-yarn-ui >> /soft/code/bigtop/logs/ 2>> /soft/code/bigtop/logs/
Problem points and solutions:
- 1) cmake3: see the basic environment configuration above
- 2) hadoop-yarn-ui compilation
- hadoop-yarn-ui: add: bower install moment-timezone=/moment/#=0.5.1 --allow-root
- 3) bower needs network access; make sure the connection to github is stable
Concrete steps:
- 1) Before hadoop-yarn-ui compiles, test whether github is reachable:
git ls-remote --tags --heads /moment/
This command checks that the github branch information is accessible.
- 2) If, at compile time, moment-timezone#0.5.1 or ember#2.8.0 cannot be installed, the error is: ENORES No version found that was able to satisfy
- In that case, add before the bower install command in hadoop-yarn-ui:
bower install moment-timezone=/moment/#=0.5.1 --allow-root
(see the hadoop code modification above for details)
- Front-end command locations:
- Where the node, yarn, and bower commands are executed (based on the logs): bigtop/build/hadoop/rpm/BUILD/hadoop-3.3.6-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/ component/webapp
- Local yarn: webapp/node/yarn/dist/bin/yarn install
- Local bower: webapp/node_modules/bower/bin/bower; if compiling as the root account, add --allow-root
hbase
./gradlew hbase-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
hive
./gradlew hive-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
phoenix
./gradlew bigtop-ambari-mpack-rpm phoenix-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
tez
./gradlew tez-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
Problem points and solutions:
- 1) root builds: in apache-tez-0.10.2-src/tez-ui/ , at line 37 change allow-root-build to --allow-root=true
- 2) bower mirror: change the registry to the domestic mirror
- Files involved:
- bigtop\bigtop-packages\src\common\ambari\
- bigtop\bigtop-packages\src\common\tez\
- 3) Certificate expired: codemirror#5.11.0 certificate has expired
- export BOWER_STRICT_SSL=false
- bower global configuration: "strict-ssl": false , see above for details
Error messages:
- 1)bower ember-cli-shims#0.0.6 ECONNREFUSED Request to /packages/ember-cli-shims failed: connect ECONNREFUSED 108.160.169.186:443
- Solution: bower domestic mirror settings, see above
- 2)bower codemirror#5.11.0 CERT_HAS_EXPIRED Request to /packages/codemirror failed: certificate has expired
- Solution: Certificate expiration setting, see above
spark (29 subprojects)
- The gradle command
./gradlew spark-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
- maven commands (please refer to the commands in the log):
mvn clean package -DskipTests -=/root/.ivy2 -=/root/.ivy2 -=/root -= -=file:///root/.m2/repository -=3.3.6 -=3.3.6 -Pyarn -Phadoop-3.2 -Phive -Phive-thriftserver -Psparkr -Pkubernetes -Pscala-2.12 -=27.0-jre -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss" >> /soft/code/bigtop/ 2>> /soft/code/bigtop/
- Description: gradle first calls the dev/ script in the extracted spark source for mvn clean package, then performs a maven install, which doubles the time of a single pass; total pure compilation time is around 150 minutes
Problem Points|Solutions
- 1) SparkR packages fine, but the local MAKE_R step keeps failing; consider disabling the local R make
- Solution: in bigtop\bigtop-packages\src\common\spark\do-component-build, in the line . /dev/ --mvn mvn --r $BUILD_OPTS -DskipTests , remove --r
- 2) Mixed scala/java compilation is slow; make the maven compilation log show timestamps
- vim dl/spark-3.2.3/dev/
- Comment out the line:
BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
- BUILD_COMMAND=("$MVN" clean package -DskipTests -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss" $@) #add the timestamp-printing parameters on this line
- vim bigtop\bigtop-packages\src\common\spark\do-component-build
- mvn $BUILD_OPTS install -DskipTests=$SPARK_SKIP_TESTS -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat="yyyy-MM-dd HH:mm:ss" #add the timestamp-printing parameters on this line
Error messages
- 1)Error in loadVignetteBuilder(pkgdir, TRUE) : vignette builder 'knitr' not found
- Solution: Remove the local make_r
kafka
./gradlew kafka-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
Notes:
- 1) Access:
- Solution: hosts file configuration (see above)
flink (207 subprojects)
-
./gradlew flink-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
-
Note: This project cannot skip the test code compilation
Problem points and solutions:
1) The relevant jar package cannot be downloaded.
-
Solution: Manually download and register to the local repository, see above for details
-
2) Errors in the test code:
-
- node and npm failed to download.
- Downloading /dist/v16.13.2/node-v16.13. to /root/.m2/repository/com/github/eirslett/node/16.13.2/node-16.13.
- Downloading /npm/-/npm-8.1. to /root/.m2/repository/com/github/eirslett/npm/8.1.2/npm-8.1.
- Solution: download manually and put the files in the target locations (note that the file name, including the suffix, must match)
- mv /home/zxh/soft/ambari-develop/node-v16.13. /root/.m2/repository/com/github/eirslett/node/16.13.2/node-16.13.
- mv /home/zxh/soft/ambari-develop/npm-8.1. /root/.m2/repository/com/github/eirslett/npm/8.1.2/npm-8.1.
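The mv commands above place the files into the local-repo layout that frontend-maven-plugin resolves (com/github/eirslett/...); a helper sketch, where the demo file name is a stand-in (the real name must match exactly what the failed download in the log was going to write) and M2_REPO defaults to a temp dir instead of $HOME/.m2/repository:

```shell
# Sketch: copy a hand-downloaded node/npm artifact into the maven repo layout.
M2_REPO="${M2_REPO:-$(mktemp -d)}"   # normally $HOME/.m2/repository
place_artifact() {
    file="$1"; name="$2"; version="$3"
    dest="$M2_REPO/com/github/eirslett/$name/$version"
    mkdir -p "$dest"
    cp "$file" "$dest/"
    echo "$dest/$(basename "$file")"
}

# demo with a stand-in file name
tmp=$(mktemp -d)
echo stub > "$tmp/node-16.13.2-linux-x64.tar.gz"
placed=$(place_artifact "$tmp/node-16.13.2-linux-x64.tar.gz" node 16.13.2)
echo "$placed"
```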
-
solr
./gradlew solr-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
- Notes:
- 1) This project is compiled with ivy; configuring domestic sources is strongly recommended.
zeppelin
./gradlew zeppelin-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix -x test >> logs/ 2>> logs/
#download the zeppelin source package
./gradlew zeppelin-download
#extract the zeppelin source
cd dl
tar -zxvf zeppelin-0.10.
#modify the pom files
vi zeppelin-0.10.1/
At line 209, change the value to true
vi zeppelin-0.10.1/spark/
At line 50, change to /apache/spark/${}/${}.tgz
At line 53, change to /apache/spark/${}/${}-
vi zeppelin-0.10.1/rlang/
At line 41, change to /apache/spark/${}/${}.tgz
At line 44, change to /apache/spark/${}/${}-
vi zeppelin-0.10.1/flink/flink-scala-parent/
At line 45, change to /apache/flink/flink-${}/flink-${}-bin-scala_${}.tgz
#repackage the zeppelin source
tar -zcvf zeppelin-0.10. zeppelin-0.10.1
#compile
./gradlew zeppelin-rpm -PparentDir=/usr/bigtop
Issue:
[INFO] Downloading /npm/-/npm-6.9. to /root/.m2/repository/com/github/eirslett/npm/6.9.0/npm-6.9.
I/O exception () caught when processing request to {s}->:443: Connection reset
Solution: download from Aliyun or Huawei Cloud, place the file at the target location above, and note the file renaming
/macports/distfiles/npm6/npm-6.9.
mvn -Dhadoop3.=3.3.6 -=0.7.1-incubating -Pscala-2.11 -Phadoop3 -Pbuild-distr -DskipTests clean package -pl '!beam,!hbase,!pig,!jdbc,!flink,!ignite,!kylin,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!groovy,!sap,!java,!geode,!neo4j,!hazelcastjet,!submarine,!sparql,!mongodb,!ksql,!scalding' -am -rf :zeppelin-client-examples >> /soft/code/bigtop/logs/ 2>> /soft/code/bigtop/logs/
Issue: Bower resolver not found: bower-shrinkwrap-resolver-ext
/jameBo/p/
Compiled Disk Occupancy
- Disk Occupancy:
- bigtop 11G
- ambari 2.1G
- ambari 6G
- Public Segment Disk Occupancy
- .m2: 12G
- .ant 2M
- .ivy2 200M
- .gradle 1.2G
- .nvm 75M
- .npm 400M
- .cache 400M
- Total disk usage: 34G+
branch-3.2 Compilation Time for Each Component [Pure compilation time, excluding the time to download related packages]
- Note: times assume an error-free run
hbase
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache HBase 2.4.17:
[INFO]
[INFO] Apache HBase ....................................... SUCCESS [ 5.172 s]
[INFO] Apache HBase - Checkstyle .......................... SUCCESS [ 1.053 s]
[INFO] Apache HBase - Annotations ......................... SUCCESS [ 0.786 s]
[INFO] Apache HBase - Build Configuration ................. SUCCESS [ 0.313 s]
[INFO] Apache HBase - Logging ............................. SUCCESS [ 1.018 s]
[INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [ 40.242 s]
[INFO] Apache HBase - Common .............................. SUCCESS [ 10.561 s]
[INFO] Apache HBase - Metrics API ......................... SUCCESS [ 6.383 s]
[INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [ 10.353 s]
[INFO] Apache HBase - Metrics Implementation .............. SUCCESS [ 6.360 s]
[INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [ 8.477 s]
[INFO] Apache HBase - Protocol ............................ SUCCESS [ 9.247 s]
[INFO] Apache HBase - Client .............................. SUCCESS [ 8.566 s]
[INFO] Apache HBase - Zookeeper ........................... SUCCESS [ 7.584 s]
[INFO] Apache HBase - Replication ......................... SUCCESS [ 7.105 s]
[INFO] Apache HBase - Resource Bundle ..................... SUCCESS [ 0.327 s]
[INFO] Apache HBase - HTTP ................................ SUCCESS [ 9.274 s]
[INFO] Apache HBase - Asynchronous FileSystem ............. SUCCESS [ 9.537 s]
[INFO] Apache HBase - Procedure ........................... SUCCESS [ 6.110 s]
[INFO] Apache HBase - Server .............................. SUCCESS [ 19.575 s]
[INFO] Apache HBase - MapReduce ........................... SUCCESS [ 11.442 s]
[INFO] Apache HBase - Testing Util ........................ SUCCESS [ 13.362 s]
[INFO] Apache HBase - Thrift .............................. SUCCESS [ 14.274 s]
[INFO] Apache HBase - RSGroup ............................. SUCCESS [ 10.479 s]
[INFO] Apache HBase - Shell ............................... SUCCESS [ 13.807 s]
[INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [ 11.806 s]
[INFO] Apache HBase - Integration Tests ................... SUCCESS [ 13.822 s]
[INFO] Apache HBase - Rest ................................ SUCCESS [ 10.202 s]
[INFO] Apache HBase - Examples ............................ SUCCESS [ 14.512 s]
[INFO] Apache HBase - Shaded .............................. SUCCESS [ 0.385 s]
[INFO] Apache HBase - Shaded - Client (with Hadoop bundled) SUCCESS [ 31.657 s]
[INFO] Apache HBase - Shaded - Client ..................... SUCCESS [ 18.075 s]
[INFO] Apache HBase - Shaded - MapReduce .................. SUCCESS [ 28.745 s]
[INFO] Apache HBase - External Block Cache ................ SUCCESS [ 9.085 s]
[INFO] Apache HBase - HBTop ............................... SUCCESS [ 7.789 s]
[INFO] Apache HBase - Assembly ............................ SUCCESS [02:21 min]
[INFO] Apache HBase - Shaded - Testing Util ............... SUCCESS [01:08 min]
[INFO] Apache HBase - Shaded - Testing Util Tester ........ SUCCESS [ 10.186 s]
[INFO] Apache HBase Shaded Packaging Invariants ........... SUCCESS [ 11.475 s]
[INFO] Apache HBase Shaded Packaging Invariants (with Hadoop bundled) SUCCESS [ 7.806 s]
[INFO] Apache HBase - Archetypes .......................... SUCCESS [ 0.088 s]
[INFO] Apache HBase - Exemplar for hbase-client archetype . SUCCESS [ 8.892 s]
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SUCCESS [ 11.462 s]
[INFO] Apache HBase - Archetype builder ................... SUCCESS [ 0.595 s]
[INFO] ------------------------------------------------------------------------
Phoenix
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Phoenix 5.1.3:
[INFO]
[INFO] Apache Phoenix ..................................... SUCCESS [ 7.259 s]
[INFO] Phoenix Hbase 2.5.0 compatibility .................. SUCCESS [ 26.320 s]
[INFO] Phoenix Hbase 2.4.1 compatibility .................. SUCCESS [ 15.803 s]
[INFO] Phoenix Hbase 2.4.0 compatibility .................. SUCCESS [ 16.274 s]
[INFO] Phoenix Hbase 2.3.0 compatibility .................. SUCCESS [ 21.425 s]
[INFO] Phoenix Hbase 2.2.5 compatibility .................. SUCCESS [ 13.561 s]
[INFO] Phoenix Hbase 2.1.6 compatibility .................. SUCCESS [ 12.690 s]
[INFO] Phoenix Core ....................................... SUCCESS [ 57.483 s]
[INFO] Phoenix - Pherf .................................... SUCCESS [ 17.866 s]
[INFO] Phoenix - Tracing Web Application .................. SUCCESS [ 11.662 s]
[INFO] Phoenix Client Parent .............................. SUCCESS [ 0.046 s]
[INFO] Phoenix Client ..................................... SUCCESS [05:54 min]
[INFO] Phoenix Client Embedded ............................ SUCCESS [04:47 min]
[INFO] Phoenix Server JAR ................................. SUCCESS [ 59.698 s]
[INFO] Phoenix Assembly ................................... SUCCESS [ 22.830 s]
[INFO] ------------------------------------------------------------------------
tez
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for tez 0.10.2:
[INFO]
[INFO] tez ................................................ SUCCESS [ 5.045 s]
[INFO] hadoop-shim ........................................ SUCCESS [ 6.835 s]
[INFO] tez-api ............................................ SUCCESS [ 16.603 s]
[INFO] tez-build-tools .................................... SUCCESS [ 0.765 s]
[INFO] tez-common ......................................... SUCCESS [ 3.871 s]
[INFO] tez-runtime-internals .............................. SUCCESS [ 4.840 s]
[INFO] tez-runtime-library ................................ SUCCESS [ 9.267 s]
[INFO] tez-mapreduce ...................................... SUCCESS [ 5.369 s]
[INFO] tez-examples ....................................... SUCCESS [ 3.140 s]
[INFO] tez-dag ............................................ SUCCESS [ 12.564 s]
[INFO] tez-tests .......................................... SUCCESS [ 6.626 s]
[INFO] tez-ext-service-tests .............................. SUCCESS [ 5.906 s]
[INFO] tez-ui ............................................. SUCCESS [01:36 min]
[INFO] tez-plugins ........................................ SUCCESS [ 0.072 s]
[INFO] tez-protobuf-history-plugin ........................ SUCCESS [ 6.192 s]
[INFO] tez-yarn-timeline-history .......................... SUCCESS [ 7.515 s]
[INFO] tez-yarn-timeline-history-with-acls ................ SUCCESS [ 8.769 s]
[INFO] tez-yarn-timeline-cache-plugin ..................... SUCCESS [ 31.190 s]
[INFO] tez-yarn-timeline-history-with-fs .................. SUCCESS [ 7.541 s]
[INFO] tez-history-parser ................................. SUCCESS [ 22.187 s]
[INFO] tez-aux-services ................................... SUCCESS [ 17.969 s]
[INFO] tez-tools .......................................... SUCCESS [ 0.159 s]
[INFO] tez-perf-analyzer .................................. SUCCESS [ 0.095 s]
[INFO] tez-job-analyzer ................................... SUCCESS [ 4.962 s]
[INFO] tez-javadoc-tools .................................. SUCCESS [ 1.292 s]
[INFO] hadoop-shim-impls .................................. SUCCESS [ 0.074 s]
[INFO] hadoop-shim-2.8 .................................... SUCCESS [ 1.260 s]
[INFO] tez-dist ........................................... SUCCESS [ 51.987 s]
[INFO] Tez ................................................ SUCCESS [ 0.416 s]
[INFO] ------------------------------------------------------------------------
spark
- mvn clean package
2024-07-22 15:37:08.290 [INFO] ------------------------------------------------------------------------
2024-07-22 15:37:08.290 [INFO] Reactor Summary for Spark Project Parent POM 3.2.3:
2024-07-22 15:37:08.290 [INFO]
2024-07-22 15:37:08.292 [INFO] Spark Project Parent POM ........................... SUCCESS [ 7.644 s]
2024-07-22 15:37:08.292 [INFO] Spark Project Tags ................................. SUCCESS [ 17.558 s]
2024-07-22 15:37:08.292 [INFO] Spark Project Sketch ............................... SUCCESS [ 14.904 s]
2024-07-22 15:37:08.293 [INFO] Spark Project Local DB ............................. SUCCESS [ 4.335 s]
2024-07-22 15:37:08.293 [INFO] Spark Project Networking ........................... SUCCESS [ 9.101 s]
2024-07-22 15:37:08.293 [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 4.797 s]
2024-07-22 15:37:08.294 [INFO] Spark Project Unsafe ............................... SUCCESS [ 18.140 s]
2024-07-22 15:37:08.294 [INFO] Spark Project Launcher ............................. SUCCESS [ 3.284 s]
2024-07-22 15:37:08.294 [INFO] Spark Project Core ................................. SUCCESS [06:30 min]  (~6 min)
2024-07-22 15:37:08.294 [INFO] Spark Project ML Local Library ..................... SUCCESS [01:39 min]
2024-07-22 15:37:08.295 [INFO] Spark Project GraphX ............................... SUCCESS [01:31 min]
2024-07-22 15:37:08.295 [INFO] Spark Project Streaming ............................ SUCCESS [03:09 min]  (~3 min)
2024-07-22 15:37:08.295 [INFO] Spark Project Catalyst ............................. SUCCESS [07:35 min]  (~7 min)
2024-07-22 15:37:08.296 [INFO] Spark Project SQL .................................. SUCCESS [12:56 min]  (~13 min)
2024-07-22 15:37:08.296 [INFO] Spark Project ML Library ........................... SUCCESS [08:11 min]  (~8 min)
2024-07-22 15:37:08.296 [INFO] Spark Project Tools ................................ SUCCESS [ 18.675 s]
2024-07-22 15:37:08.296 [INFO] Spark Project Hive ................................. SUCCESS [05:02 min]  (~5 min)
2024-07-22 15:37:08.296 [INFO] Spark Project REPL ................................. SUCCESS [01:13 min]
2024-07-22 15:37:08.297 [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 21.930 s]
2024-07-22 15:37:08.297 [INFO] Spark Project YARN ................................. SUCCESS [02:53 min]
2024-07-22 15:37:08.297 [INFO] Spark Project Kubernetes ........................... SUCCESS [02:39 min]
2024-07-22 15:37:08.297 [INFO] Spark Project Hive Thrift Server ................... SUCCESS [02:12 min]
2024-07-22 15:37:08.298 [INFO] Spark Project Assembly ............................. SUCCESS [ 6.690 s]
2024-07-22 15:37:08.298 [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [ 59.635 s]
2024-07-22 15:37:08.298 [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:28 min]
2024-07-22 15:37:08.299 [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [02:50 min]
2024-07-22 15:37:08.299 [INFO] Spark Project Examples ............................. SUCCESS [01:51 min]
2024-07-22 15:37:08.299 [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 19.932 s]
2024-07-22 15:37:08.300 [INFO] Spark Avro ......................................... SUCCESS [02:07 min]
2024-07-22 15:37:08.300 [INFO] ------------------------------------------------------------------------
2024-07-22 15:37:08.301 [INFO] BUILD SUCCESS
2024-07-22 15:37:08.301 [INFO] ------------------------------------------------------------------------
2024-07-22 15:37:08.302 [INFO] Total time: 01:07 h
2024-07-22 15:37:08.302 [INFO] Finished at: 2024-07-22T15:37:08+08:00
2024-07-22 15:37:08.303 [INFO] ------------------------------------------------------------------------
Issue: when compiling at home, the build was noticeably slower and CPU utilization was also lower; my guess is that it was because something had not been turned off.
2024-07-22 07:21:01.933 [INFO] ------------------------------------------------------------------------
2024-07-22 07:21:01.934 [INFO] Reactor Summary for Spark Project Parent POM 3.2.3:
2024-07-22 07:21:01.936 [INFO]
2024-07-22 07:21:01.939 [INFO] Spark Project Parent POM ........................... SUCCESS [ 24.693 s]
2024-07-22 07:21:01.941 [INFO] Spark Project Tags ................................. SUCCESS [ 47.200 s]
2024-07-22 07:21:01.943 [INFO] Spark Project Sketch ............................... SUCCESS [ 53.843 s]
2024-07-22 07:21:01.945 [INFO] Spark Project Local DB ............................. SUCCESS [ 15.175 s]
2024-07-22 07:21:01.947 [INFO] Spark Project Networking ........................... SUCCESS [ 33.738 s]
2024-07-22 07:21:01.949 [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 17.500 s]
2024-07-22 07:21:01.951 [INFO] Spark Project Unsafe ............................... SUCCESS [01:06 min]
2024-07-22 07:21:01.952 [INFO] Spark Project Launcher ............................. SUCCESS [ 12.097 s]
2024-07-22 07:21:01.954 [INFO] Spark Project Core ................................. SUCCESS [23:34 min]
2024-07-22 07:21:01.957 [INFO] Spark Project ML Local Library ..................... SUCCESS [04:01 min]
2024-07-22 07:21:01.960 [INFO] Spark Project GraphX ............................... SUCCESS [04:43 min]
2024-07-22 07:21:01.962 [INFO] Spark Project Streaming ............................ SUCCESS [08:30 min]
2024-07-22 07:21:01.962 [INFO] Spark Project Catalyst ............................. SUCCESS [24:34 min]
2024-07-22 07:21:01.963 [INFO] Spark Project SQL .................................. SUCCESS [38:07 min]
2024-07-22 07:21:01.965 [INFO] Spark Project ML Library ........................... SUCCESS [25:03 min]
2024-07-22 07:21:01.966 [INFO] Spark Project Tools ................................ SUCCESS [01:09 min]
2024-07-22 07:21:01.969 [INFO] Spark Project Hive ................................. SUCCESS [15:42 min]
2024-07-22 07:21:01.972 [INFO] Spark Project REPL ................................. SUCCESS [03:50 min]
2024-07-22 07:21:01.973 [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [01:20 min]
2024-07-22 07:21:01.975 [INFO] Spark Project YARN ................................. SUCCESS [08:42 min]
2024-07-22 07:21:01.976 [INFO] Spark Project Hive Thrift Server ................... SUCCESS [08:33 min]
2024-07-22 07:21:01.976 [INFO] Spark Project Assembly ............................. SUCCESS [ 36.134 s]
2024-07-22 07:21:01.978 [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [03:52 min]
2024-07-22 07:21:01.981 [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [05:26 min]
2024-07-22 07:21:01.982 [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [10:05 min]
2024-07-22 07:21:01.983 [INFO] Spark Project Examples ............................. SUCCESS [06:48 min]
2024-07-22 07:21:01.984 [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [01:18 min]
2024-07-22 07:21:01.985 [INFO] Spark Avro ......................................... SUCCESS [08:12 min]
2024-07-22 07:21:01.987 [INFO] ------------------------------------------------------------------------
2024-07-22 07:21:01.987 [INFO] BUILD SUCCESS
2024-07-22 07:21:01.988 [INFO] ------------------------------------------------------------------------
2024-07-22 07:21:01.991 [INFO] Total time: 03:28 h
2024-07-22 07:21:01.994 [INFO] Finished at: 2024-07-22T07:21:01+08:00
2024-07-22 07:21:01.995 [INFO] ------------------------------------------------------------------------
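To compare the two runs above module by module, the reactor-summary lines can be reduced to seconds with a short script. A minimal sketch (the regex assumes Maven's standard summary format; module names too long for Maven to print a dot leader, such as some of the HBase shaded-client lines, are skipped by this version):

```python
import re

# Matches a Maven reactor-summary line such as:
#   [INFO] Spark Project Core ............... SUCCESS [06:30 min]
#   [INFO] tez-api .......................... SUCCESS [ 16.603 s]
# search() is used, so a timestamp prefix before [INFO] is fine.
LINE = re.compile(
    r"\[INFO\]\s+(.*?)\s*\.{2,}\s+SUCCESS\s+\[\s*([\d:.]+)\s*(s|min|h)\s*\]"
)

def parse_reactor_line(line):
    """Return (module, seconds) for a SUCCESS reactor line, else None."""
    m = LINE.search(line)
    if not m:
        return None
    name, value, unit = m.groups()
    if unit == "s":
        return name, float(value)
    # "min" entries are printed as MM:SS, "h" entries as HH:MM
    a, b = (int(x) for x in value.split(":"))
    secs = a * 60 + b
    return name, secs * 60 if unit == "h" else secs

if __name__ == "__main__":
    demo = [
        "2024-07-22 15:37:08.294 [INFO] Spark Project Core "
        "................................. SUCCESS [06:30 min]",
        "2024-07-22 07:21:01.954 [INFO] Spark Project Core "
        "................................. SUCCESS [23:34 min]",
    ]
    fast, slow = (parse_reactor_line(l) for l in demo)
    print(f"{fast[0]}: {slow[1] / fast[1]:.1f}x slower at home")
```

Feeding both logs through this makes the slowdown obvious per module (Spark Project Core alone goes from 06:30 to 23:34, roughly 3.6x).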
- mvn install
2024-07-22 16:43:49.807 [INFO] ------------------------------------------------------------------------
2024-07-22 16:43:49.808 [INFO] Reactor Summary for Spark Project Parent POM 3.2.3:
2024-07-22 16:43:49.808 [INFO]
2024-07-22 16:43:49.809 [INFO] Spark Project Parent POM ........................... SUCCESS [ 8.358 s]
2024-07-22 16:43:49.809 [INFO] Spark Project Tags ................................. SUCCESS [ 11.288 s]
2024-07-22 16:43:49.810 [INFO] Spark Project Sketch ............................... SUCCESS [ 7.351 s]
2024-07-22 16:43:49.810 [INFO] Spark Project Local DB ............................. SUCCESS [ 9.763 s]
2024-07-22 16:43:49.810 [INFO] Spark Project Networking ........................... SUCCESS [ 19.142 s]
2024-07-22 16:43:49.810 [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 19.644 s]
2024-07-22 16:43:49.811 [INFO] Spark Project Unsafe ............................... SUCCESS [ 9.358 s]
2024-07-22 16:43:49.811 [INFO] Spark Project Launcher ............................. SUCCESS [ 8.332 s]
2024-07-22 16:43:49.811 [INFO] Spark Project Core ................................. SUCCESS [06:43 min]
2024-07-22 16:43:49.811 [INFO] Spark Project ML Local Library ..................... SUCCESS [01:14 min]
2024-07-22 16:43:49.811 [INFO] Spark Project GraphX ............................... SUCCESS [01:23 min]
2024-07-22 16:43:49.812 [INFO] Spark Project Streaming ............................ SUCCESS [02:23 min]
2024-07-22 16:43:49.812 [INFO] Spark Project Catalyst ............................. SUCCESS [07:44 min]
2024-07-22 16:43:49.812 [INFO] Spark Project SQL .................................. SUCCESS [12:42 min]
2024-07-22 16:43:49.812 [INFO] Spark Project ML Library ........................... SUCCESS [08:15 min]
2024-07-22 16:43:49.812 [INFO] Spark Project Tools ................................ SUCCESS [ 15.477 s]
2024-07-22 16:43:49.813 [INFO] Spark Project Hive ................................. SUCCESS [04:28 min]
2024-07-22 16:43:49.813 [INFO] Spark Project REPL ................................. SUCCESS [01:13 min]
2024-07-22 16:43:49.814 [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 26.235 s]
2024-07-22 16:43:49.814 [INFO] Spark Project YARN ................................. SUCCESS [02:35 min]
2024-07-22 16:43:49.815 [INFO] Spark Project Kubernetes ........................... SUCCESS [02:32 min]
2024-07-22 16:43:49.815 [INFO] Spark Project Hive Thrift Server ................... SUCCESS [02:35 min]
2024-07-22 16:43:49.815 [INFO] Spark Project Assembly ............................. SUCCESS [ 5.722 s]
2024-07-22 16:43:49.816 [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [ 59.883 s]
2024-07-22 16:43:49.816 [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:30 min]
2024-07-22 16:43:49.816 [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [02:57 min]
2024-07-22 16:43:49.817 [INFO] Spark Project Examples ............................. SUCCESS [02:22 min]
2024-07-22 16:43:49.817 [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 17.156 s]
2024-07-22 16:43:49.817 [INFO] Spark Avro ......................................... SUCCESS [02:15 min]
2024-07-22 16:43:49.818 [INFO] ------------------------------------------------------------------------
2024-07-22 16:43:49.818 [INFO] BUILD SUCCESS
2024-07-22 16:43:49.818 [INFO] ------------------------------------------------------------------------
2024-07-22 16:43:49.818 [INFO] Total time: 01:06 h
2024-07-22 16:43:49.819 [INFO] Finished at: 2024-07-22T16:43:49+08:00
2024-07-22 16:43:49.819 [INFO] ------------------------------------------------------------------------
flink
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Flink : [pom]
[INFO] Flink : Annotations [jar]
[INFO] Flink : Architecture Tests [pom]
[INFO] Flink : Architecture Tests : Base [jar]
[INFO] Flink : Test utils : [pom]
[INFO] Flink : Test utils : Junit [jar]
[INFO] Flink : Metrics : [pom]
[INFO] Flink : Metrics : Core [jar]
[INFO] Flink : Core [jar]
[INFO] Flink : Table : [pom]
[INFO] Flink : Table : Common [jar]
[INFO] Flink : Table : API Java [jar]
[INFO] Flink : Java [jar]
[INFO] Flink : Connectors : [pom]
[INFO] Flink : Connectors : File Sink Common [jar]
[INFO] Flink : RPC : [pom]
[INFO] Flink : RPC : Core [jar]
[INFO] Flink : RPC : Akka [jar]
[INFO] Flink : RPC : Akka-Loader [jar]
[INFO] Flink : Queryable state : [pom]
[INFO] Flink : Queryable state : Client Java [jar]
[INFO] Flink : FileSystems : [pom]
[INFO] Flink : FileSystems : Hadoop FS [jar]
[INFO] Flink : Runtime [jar]
[INFO] Flink : Streaming Java [jar]
[INFO] Flink : Table : API bridge base [jar]
[INFO] Flink : Table : API Java bridge [jar]
[INFO] Flink : Table : Code Splitter [jar]
[INFO] Flink : Optimizer [jar]
[INFO] Flink : Clients [jar]
[INFO] Flink : DSTL [pom]
[INFO] Flink : DSTL : DFS [jar]
[INFO] Flink : State backends : [pom]
[INFO] Flink : State backends : RocksDB [jar]
[INFO] Flink : State backends : Changelog [jar]
[INFO] Flink : Test utils : Utils [jar]
[INFO] Flink : Libraries : [pom]
[INFO] Flink : Libraries : CEP [jar]
[INFO] Flink : Table : Runtime [jar]
[INFO] Flink : Scala [jar]
[INFO] Flink : Table : SQL Parser [jar]
[INFO] Flink : Table : SQL Parser Hive [jar]
[INFO] Flink : Table : API Scala [jar]
[INFO] Flink : Test utils : Connectors [jar]
[INFO] Flink : Architecture Tests : Test [jar]
[INFO] Flink : Connectors : Base [jar]
[INFO] Flink : Connectors : Files [jar]
[INFO] Flink : Examples : [pom]
[INFO] Flink : Examples : Batch [jar]
[INFO] Flink : Connectors : Hadoop compatibility [jar]
[INFO] Flink : Tests [jar]
[INFO] Flink : Streaming Scala [jar]
[INFO] Flink : Table : API Scala bridge [jar]
[INFO] Flink : Table : Planner [jar]
[INFO] Flink : Formats : [pom]
[INFO] Flink : Format : Common [jar]
[INFO] Flink : Formats : Csv [jar]
[INFO] Flink : Formats : Hadoop bulk [jar]
[INFO] Flink : Formats : Orc [jar]
[INFO] Flink : Formats : Orc nohive [jar]
[INFO] Flink : Formats : Avro [jar]
[INFO] Flink : Formats : Parquet [jar]
[INFO] Flink : Connectors : Hive [jar]
[INFO] Flink : Python [jar]
[INFO] Flink : Table : SQL Client [jar]
[INFO] Flink : Connectors : AWS Base [jar]
[INFO] Flink : Connectors : Cassandra [jar]
[INFO] Flink : Formats : Json [jar]
[INFO] Flink : Connectors : Elasticsearch base [jar]
[INFO] Flink : Connectors : Elasticsearch 6 [jar]
[INFO] Flink : Connectors : Elasticsearch 7 [jar]
[INFO] Flink : Connectors : Google PubSub [jar]
[INFO] Flink : Connectors : HBase base [jar]
[INFO] Flink : Connectors : HBase 1.4 [jar]
[INFO] Flink : Connectors : HBase 2.2 [jar]
[INFO] Flink : Connectors : JDBC [jar]
[INFO] Flink : Metrics : JMX [jar]
[INFO] Flink : Formats : Avro confluent registry [jar]
[INFO] Flink : Connectors : Kafka [jar]
[INFO] Flink : Connectors : Amazon Kinesis Data Streams [jar]
[INFO] Flink : Connectors : Kinesis [jar]
[INFO] Flink : Connectors : Nifi [jar]
[INFO] Flink : Connectors : Pulsar [jar]
[INFO] Flink : Connectors : RabbitMQ [jar]
[INFO] Flink : Architecture Tests : Production [jar]
[INFO] Flink : FileSystems : Hadoop FS shaded [jar]
[INFO] Flink : FileSystems : S3 FS Base [jar]
[INFO] Flink : FileSystems : S3 FS Hadoop [jar]
[INFO] Flink : FileSystems : S3 FS Presto [jar]
[INFO] Flink : FileSystems : OSS FS [jar]
[INFO] Flink : FileSystems : Azure FS Hadoop [jar]
[INFO] Flink : FileSystems : Google Storage FS Hadoop [jar]
[INFO] Flink : Runtime web [jar]
[INFO] Flink : Connectors : HCatalog [jar]
[INFO] Flink : Connectors : Amazon Kinesis Data Firehose [jar]
[INFO] Flink : Connectors : SQL : Elasticsearch 6 [jar]
[INFO] Flink : Connectors : SQL : Elasticsearch 7 [jar]
[INFO] Flink : Connectors : SQL : HBase 1.4 [jar]
[INFO] Flink : Connectors : SQL : HBase 2.2 [jar]
[INFO] Flink : Connectors : SQL : Hive 1.2.2 [jar]
[INFO] Flink : Connectors : SQL : Hive 2.2.0 [jar]
[INFO] Flink : Connectors : SQL : Hive 2.3.6 [jar]
[INFO] Flink : Connectors : SQL : Hive 3.1.2 [jar]
[INFO] Flink : Connectors : SQL : Kafka [jar]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Streams [jar]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Firehose [jar]
[INFO] Flink : Connectors : SQL : Kinesis [jar]
[INFO] Flink : Connectors : SQL : Pulsar [jar]
[INFO] Flink : Connectors : SQL : RabbitMQ [jar]
[INFO] Flink : Formats : Sequence file [jar]
[INFO] Flink : Formats : Compress [jar]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry [jar]
[INFO] Flink : Formats : JSON AWS Glue Schema Registry [jar]
[INFO] Flink : Formats : SQL Orc [jar]
[INFO] Flink : Formats : SQL Parquet [jar]
[INFO] Flink : Formats : SQL Avro [jar]
[INFO] Flink : Formats : SQL Avro Confluent Registry [jar]
[INFO] Flink : Examples : Streaming [jar]
[INFO] Flink : Examples : Table [jar]
[INFO] Flink : Examples : Build Helper : [pom]
[INFO] Flink : Examples : Build Helper : Streaming State machine [jar]
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub [jar]
[INFO] Flink : Container [jar]
[INFO] Flink : Queryable state : Runtime [jar]
[INFO] Flink : Dist-Scala [jar]
[INFO] Flink : Kubernetes [jar]
[INFO] Flink : Yarn [jar]
[INFO] Flink : Table : API Java Uber [jar]
[INFO] Flink : Table : Planner Loader Bundle [jar]
[INFO] Flink : Table : Planner Loader [jar]
[INFO] Flink : Libraries : Gelly [jar]
[INFO] Flink : Libraries : Gelly scala [jar]
[INFO] Flink : Libraries : Gelly Examples [jar]
[INFO] Flink : External resources : [pom]
[INFO] Flink : External resources : GPU [jar]
[INFO] Flink : Metrics : Dropwizard [jar]
[INFO] Flink : Metrics : Graphite [jar]
[INFO] Flink : Metrics : InfluxDB [jar]
[INFO] Flink : Metrics : Prometheus [jar]
[INFO] Flink : Metrics : StatsD [jar]
[INFO] Flink : Metrics : Datadog [jar]
[INFO] Flink : Metrics : Slf4j [jar]
[INFO] Flink : Libraries : CEP Scala [jar]
[INFO] Flink : Libraries : State processor API [jar]
[INFO] Flink : Dist [jar]
[INFO] Flink : Yarn Tests [jar]
[INFO] Flink : E2E Tests : [pom]
[INFO] Flink : E2E Tests : CLI [jar]
[INFO] Flink : E2E Tests : Parent Child classloading program [jar]
[INFO] Flink : E2E Tests : Parent Child classloading lib-package [jar]
[INFO] Flink : E2E Tests : Dataset allround [jar]
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery [jar]
[INFO] Flink : E2E Tests : Datastream allround [jar]
[INFO] Flink : E2E Tests : Batch SQL [jar]
[INFO] Flink : E2E Tests : Stream SQL [jar]
[INFO] Flink : E2E Tests : Distributed cache via blob [jar]
[INFO] Flink : E2E Tests : High parallelism iterations [jar]
[INFO] Flink : E2E Tests : Stream stateful job upgrade [jar]
[INFO] Flink : E2E Tests : Queryable state [jar]
[INFO] Flink : E2E Tests : Local recovery and allocation [jar]
[INFO] Flink : E2E Tests : Elasticsearch 6 [jar]
[INFO] Flink : Quickstart : [pom]
[INFO] Flink : Quickstart : Java [maven-archetype]
[INFO] Flink : Quickstart : Scala [maven-archetype]
[INFO] Flink : E2E Tests : Quickstart [jar]
[INFO] Flink : E2E Tests : Confluent schema registry [jar]
[INFO] Flink : E2E Tests : Stream state TTL [jar]
[INFO] Flink : E2E Tests : SQL client [jar]
[INFO] Flink : E2E Tests : File sink [jar]
[INFO] Flink : E2E Tests : State evolution [jar]
[INFO] Flink : E2E Tests : RocksDB state memory control [jar]
[INFO] Flink : E2E Tests : Common [jar]
[INFO] Flink : E2E Tests : Metrics availability [jar]
[INFO] Flink : E2E Tests : Metrics reporter prometheus [jar]
[INFO] Flink : E2E Tests : Heavy deployment [jar]
[INFO] Flink : E2E Tests : Connectors : Google PubSub [jar]
[INFO] Flink : E2E Tests : Streaming Kafka base [jar]
[INFO] Flink : E2E Tests : Streaming Kafka [jar]
[INFO] Flink : E2E Tests : Plugins : [pom]
[INFO] Flink : E2E Tests : Plugins : Dummy fs [jar]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs [jar]
[INFO] Flink : E2E Tests : TPCH [jar]
[INFO] Flink : E2E Tests : Streaming Kinesis [jar]
[INFO] Flink : E2E Tests : Elasticsearch 7 [jar]
[INFO] Flink : E2E Tests : Common Kafka [jar]
[INFO] Flink : E2E Tests : TPCDS [jar]
[INFO] Flink : E2E Tests : Netty shuffle memory control [jar]
[INFO] Flink : E2E Tests : Python [jar]
[INFO] Flink : E2E Tests : HBase [jar]
[INFO] Flink : E2E Tests : Pulsar [jar]
[INFO] Flink : E2E Tests : Avro AWS Glue Schema Registry [jar]
[INFO] Flink : E2E Tests : JSON AWS Glue Schema Registry [jar]
[INFO] Flink : E2E Tests : Scala [jar]
[INFO] Flink : E2E Tests : Kinesis SQL tests [jar]
[INFO] Flink : E2E Tests : Kinesis Firehose SQL tests [jar]
[INFO] Flink : E2E Tests : SQL [jar]
[INFO] Flink : State backends : Heap spillable [jar]
[INFO] Flink : Table : Test Utils [jar]
[INFO] Flink : Contrib : [pom]
[INFO] Flink : Contrib : Connectors : Wikiedits [jar]
[INFO] Flink : FileSystems : Tests [jar]
[INFO] Flink : Docs [jar]
[INFO] Flink : Walkthrough : [pom]
[INFO] Flink : Walkthrough : Common [jar]
[INFO] Flink : Walkthrough : Datastream Java [maven-archetype]
[INFO] Flink : Walkthrough : Datastream Scala [maven-archetype]
[INFO] Flink : Tools : CI : Java [jar]
[INFO]
[INFO] -------------------< :flink-parent >--------------------
[INFO] Flink : ............................................ SUCCESS [ 4.064 s]
[INFO] Flink : Annotations ................................ SUCCESS [ 6.429 s]
[INFO] Flink : Architecture Tests ......................... SUCCESS [ 0.333 s]
[INFO] Flink : Architecture Tests : Base .................. SUCCESS [ 1.564 s]
[INFO] Flink : Test utils : ............................... SUCCESS [ 0.271 s]
[INFO] Flink : Test utils : Junit ......................... SUCCESS [ 4.591 s]
[INFO] Flink : Metrics : .................................. SUCCESS [ 0.271 s]
[INFO] Flink : Metrics : Core ............................. SUCCESS [ 2.729 s]
[INFO] Flink : Core ....................................... SUCCESS [ 58.288 s]
[INFO] Flink : Table : .................................... SUCCESS [ 0.241 s]
[INFO] Flink : Table : Common ............................. SUCCESS [ 18.151 s]
[INFO] Flink : Table : API Java ........................... SUCCESS [ 8.986 s]
[INFO] Flink : Java ....................................... SUCCESS [ 11.097 s]
[INFO] Flink : Connectors : ............................... SUCCESS [ 0.222 s]
[INFO] Flink : Connectors : File Sink Common .............. SUCCESS [ 1.121 s]
[INFO] Flink : RPC : ...................................... SUCCESS [ 0.306 s]
[INFO] Flink : RPC : Core ................................. SUCCESS [ 1.293 s]
[INFO] Flink : RPC : Akka ................................. SUCCESS [ 16.250 s]
[INFO] Flink : RPC : Akka-Loader .......................... SUCCESS [ 4.921 s]
[INFO] Flink : Queryable state : .......................... SUCCESS [ 0.227 s]
[INFO] Flink : Queryable state : Client Java .............. SUCCESS [ 1.497 s]
[INFO] Flink : FileSystems : .............................. SUCCESS [ 0.189 s]
[INFO] Flink : FileSystems : Hadoop FS .................... SUCCESS [ 6.945 s]
[INFO] Flink : Runtime .................................... SUCCESS [01:38 min]
[INFO] Flink : Streaming Java ............................. SUCCESS [ 30.549 s]
[INFO] Flink : Table : API bridge base .................... SUCCESS [ 0.893 s]
[INFO] Flink : Table : API Java bridge .................... SUCCESS [ 2.253 s]
[INFO] Flink : Table : Code Splitter ...................... SUCCESS [ 4.346 s]
[INFO] Flink : Optimizer .................................. SUCCESS [ 8.160 s]
[INFO] Flink : Clients .................................... SUCCESS [ 4.826 s]
[INFO] Flink : DSTL ....................................... SUCCESS [ 0.247 s]
[INFO] Flink : DSTL : DFS ................................. SUCCESS [ 1.941 s]
[INFO] Flink : State backends : ........................... SUCCESS [ 0.202 s]
[INFO] Flink : State backends : RocksDB ................... SUCCESS [ 4.197 s]
[INFO] Flink : State backends : Changelog ................. SUCCESS [ 1.939 s]
[INFO] Flink : Test utils : Utils ......................... SUCCESS [ 3.685 s]
[INFO] Flink : Libraries : ................................ SUCCESS [ 0.208 s]
[INFO] Flink : Libraries : CEP ............................ SUCCESS [ 6.168 s]
[INFO] Flink : Table : Runtime ............................ SUCCESS [ 16.743 s]
[INFO] Flink : Scala ...................................... SUCCESS [01:28 min]
[INFO] Flink : Table : SQL Parser ......................... SUCCESS [ 10.096 s]
[INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [ 7.248 s]
[INFO] Flink : Table : API Scala .......................... SUCCESS [ 22.686 s]
[INFO] Flink : Test utils : Connectors .................... SUCCESS [ 2.327 s]
[INFO] Flink : Architecture Tests : Test .................. SUCCESS [ 1.135 s]
[INFO] Flink : Connectors : Base .......................... SUCCESS [ 3.811 s]
[INFO] Flink : Connectors : Files ......................... SUCCESS [ 5.432 s]
[INFO] Flink : Examples : ................................. SUCCESS [ 0.301 s]
[INFO] Flink : Examples : Batch ........................... SUCCESS [ 22.082 s]
[INFO] Flink : Connectors : Hadoop compatibility .......... SUCCESS [ 12.312 s]
[INFO] Flink : Tests ...................................... SUCCESS [01:11 min]
[INFO] Flink : Streaming Scala ............................ SUCCESS [01:03 min]
[INFO] Flink : Table : API Scala bridge ................... SUCCESS [ 21.680 s]
[INFO] Flink : Table : Planner ............................ SUCCESS [07:30 min]  (~8 min)
[INFO] Flink : Formats : .................................. SUCCESS [ 0.258 s]
[INFO] Flink : Format : Common ............................ SUCCESS [ 0.341 s]
[INFO] Flink : Formats : Csv .............................. SUCCESS [ 2.072 s]
[INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [ 2.403 s]
[INFO] Flink : Formats : Orc .............................. SUCCESS [ 3.152 s]
[INFO] Flink : Formats : Orc nohive ....................... SUCCESS [ 2.388 s]
[INFO] Flink : Formats : Avro ............................. SUCCESS [ 6.456 s]
[INFO] Flink : Formats : Parquet .......................... SUCCESS [ 15.684 s]
[INFO] Flink : Connectors : Hive .......................... SUCCESS [ 34.813 s]
[INFO] Flink : Python ..................................... SUCCESS [01:08 min]
[INFO] Flink : Table : SQL Client ......................... SUCCESS [ 4.035 s]
[INFO] Flink : Connectors : AWS Base ...................... SUCCESS [ 2.094 s]
[INFO] Flink : Connectors : Cassandra ..................... SUCCESS [ 7.046 s]
[INFO] Flink : Formats : Json ............................. SUCCESS [ 2.567 s]
[INFO] Flink : Connectors : Elasticsearch base ............ SUCCESS [ 3.339 s]
[INFO] Flink : Connectors : Elasticsearch 6 ............... SUCCESS [ 2.120 s]
[INFO] Flink : Connectors : Elasticsearch 7 ............... SUCCESS [ 2.034 s]
[INFO] Flink : Connectors : Google PubSub ................. SUCCESS [ 1.555 s]
[INFO] Flink : Connectors : HBase base .................... SUCCESS [ 1.983 s]
[INFO] Flink : Connectors : HBase 1.4 ..................... SUCCESS [ 6.194 s]
[INFO] Flink : Connectors : HBase 2.2 ..................... SUCCESS [ 5.611 s]
[INFO] Flink : Connectors : JDBC .......................... SUCCESS [ 5.551 s]
[INFO] Flink : Metrics : JMX .............................. SUCCESS [ 0.784 s]
[INFO] Flink : Formats : Avro confluent registry .......... SUCCESS [ 1.002 s]
[INFO] Flink : Connectors : Kafka ......................... SUCCESS [ 15.116 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Streams ... SUCCESS [ 3.460 s]
[INFO] Flink : Connectors : Kinesis ....................... SUCCESS [ 39.773 s]
[INFO] Flink : Connectors : Nifi .......................... SUCCESS [ 1.233 s]
[INFO] Flink : Connectors : Pulsar ........................ SUCCESS [ 18.759 s]
[INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [ 1.263 s]
[INFO] Flink : Architecture Tests : Production ............ SUCCESS [ 2.390 s]
[INFO] Flink : FileSystems : Hadoop FS shaded ............. SUCCESS [ 6.516 s]
[INFO] Flink : FileSystems : S3 FS Base ................... SUCCESS [ 1.879 s]
[INFO] Flink : FileSystems : S3 FS Hadoop ................. SUCCESS [ 12.980 s]
[INFO] Flink : FileSystems : S3 FS Presto ................. SUCCESS [01:35 min]
[INFO] Flink : FileSystems : OSS FS ....................... SUCCESS [ 27.004 s]
[INFO] Flink : FileSystems : Azure FS Hadoop .............. SUCCESS [ 33.770 s]
[INFO] Flink : FileSystems : Google Storage FS Hadoop ..... SUCCESS [ 40.728 s]
[INFO] Flink : Runtime web ................................ SUCCESS [01:53 min]  (a module where compilation frequently fails)
[INFO] Flink : Connectors : HCatalog ...................... SUCCESS [ 25.040 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Firehose .. SUCCESS [ 7.678 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SUCCESS [ 18.119 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SUCCESS [ 16.245 s]
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SUCCESS [ 12.682 s]
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SUCCESS [ 24.552 s]
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SUCCESS [ 25.486 s]
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SUCCESS [ 23.168 s]
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SUCCESS [ 20.671 s]
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SUCCESS [ 28.815 s]
[INFO] Flink : Connectors : SQL : Kafka ................... SUCCESS [ 3.212 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Streams SUCCESS [ 5.737 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Firehose SUCCESS [ 6.406 s]
[INFO] Flink : Connectors : SQL : Kinesis ................. SUCCESS [ 15.596 s]
[INFO] Flink : Connectors : SQL : Pulsar .................. SUCCESS [ 8.630 s]
[INFO] Flink : Connectors : SQL : RabbitMQ ................ SUCCESS [ 1.694 s]
[INFO] Flink : Formats : Sequence file .................... SUCCESS [ 3.445 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [ 3.588 s]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SUCCESS [ 10.882 s]
[INFO] Flink : Formats : JSON AWS Glue Schema Registry .... SUCCESS [ 6.255 s]
[INFO] Flink : Formats : SQL Orc .......................... SUCCESS [ 1.724 s]
[INFO] Flink : Formats : SQL Parquet ...................... SUCCESS [ 3.187 s]
[INFO] Flink : Formats : SQL Avro ......................... SUCCESS [ 3.286 s]
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SUCCESS [ 3.646 s]
[INFO] Flink : Examples : Streaming ....................... SUCCESS [ 37.441 s]
[INFO] Flink : Examples : Table ........................... SUCCESS [ 21.271 s]
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [ 1.212 s]
[INFO] Flink : Examples : Build Helper : Streaming State machine SUCCESS [ 3.890 s]
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SUCCESS [ 11.414 s]
[INFO] Flink : Container .................................. SUCCESS [ 2.133 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [ 4.881 s]
[INFO] Flink : Dist-Scala ................................. SUCCESS [ 4.323 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 24.012 s]
[INFO] Flink : Yarn ....................................... SUCCESS [ 15.309 s]
[INFO] Flink : Table : API Java Uber ...................... SUCCESS [ 7.425 s]
[INFO] Flink : Table : Planner Loader Bundle .............. SUCCESS [ 8.391 s]
[INFO] Flink : Table : Planner Loader ..................... SUCCESS [ 9.858 s]
[INFO] Flink : Libraries : Gelly .......................... SUCCESS [ 18.030 s]
[INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 43.604 s]
[INFO] Flink : Libraries : Gelly Examples ................. SUCCESS [ 28.281 s]
[INFO] Flink : External resources : ....................... SUCCESS [ 0.674 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [ 1.818 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [ 1.815 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [ 1.660 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [ 3.359 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [ 2.531 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [ 1.492 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [ 2.686 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [ 1.298 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 29.310 s]
[INFO] Flink : Libraries : State processor API ............ SUCCESS [ 7.720 s]
[INFO] Flink : Dist ....................................... SUCCESS [ 34.103 s]
[INFO] Flink : Yarn Tests ................................. SUCCESS [ 16.673 s]
[INFO] Flink : E2E Tests : ................................ SUCCESS [ 0.682 s]
[INFO] Flink : E2E Tests : CLI ............................ SUCCESS [ 2.041 s]
[INFO] Flink : E2E Tests : Parent Child classloading program SUCCESS [ 2.058 s]
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SUCCESS [ 1.393 s]
[INFO] Flink : E2E Tests : Dataset allround ............... SUCCESS [ 1.368 s]
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SUCCESS [ 1.900 s]
[INFO] Flink : E2E Tests : Datastream allround ............ SUCCESS [ 5.782 s]
[INFO] Flink : E2E Tests : Batch SQL ...................... SUCCESS [ 1.579 s]
[INFO] Flink : E2E Tests : Stream SQL ..................... SUCCESS [ 1.804 s]
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SUCCESS [ 2.322 s]
[INFO] Flink : E2E Tests : High parallelism iterations .... SUCCESS [ 12.869 s]
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SUCCESS [ 3.227 s]
[INFO] Flink : E2E Tests : Queryable state ................ SUCCESS [ 5.336 s]
[INFO] Flink : E2E Tests : Local recovery and allocation .. SUCCESS [ 1.871 s]
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SUCCESS [ 8.550 s]
[INFO] Flink : Quickstart : ............................... SUCCESS [ 3.347 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [ 2.686 s]
[INFO] Flink : Quickstart : Scala ......................... SUCCESS [ 1.510 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SUCCESS [ 2.555 s]
[INFO] Flink : E2E Tests : Confluent schema registry ...... SUCCESS [ 5.685 s]
[INFO] Flink : E2E Tests : Stream state TTL ............... SUCCESS [ 11.581 s]
[INFO] Flink : E2E Tests : SQL client ..................... SUCCESS [ 3.902 s]
[INFO] Flink : E2E Tests : File sink ...................... SUCCESS [ 1.857 s]
[INFO] Flink : E2E Tests : State evolution ................ SUCCESS [ 3.031 s]
[INFO] Flink : E2E Tests : RocksDB state memory control ... SUCCESS [ 3.005 s]
[INFO] Flink : E2E Tests : Common ......................... SUCCESS [ 5.961 s]
[INFO] Flink : E2E Tests : Metrics availability ........... SUCCESS [ 1.442 s]
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SUCCESS [ 1.308 s]
[INFO] Flink : E2E Tests : Heavy deployment ............... SUCCESS [ 16.170 s]
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [ 3.637 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [ 2.733 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [ 14.646 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [ 0.571 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [ 1.221 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [ 1.397 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [ 4.258 s]
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 39.958 s]
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [ 7.996 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [ 6.187 s]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [ 6.090 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [ 1.916 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [ 7.097 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [ 8.455 s]
[INFO] Flink : E2E Tests : Pulsar ......................... SUCCESS [ 6.309 s]
[INFO] Flink : E2E Tests : Avro AWS Glue Schema Registry .. SUCCESS [ 5.409 s]
[INFO] Flink : E2E Tests : JSON AWS Glue Schema Registry .. SUCCESS [ 7.582 s]
[INFO] Flink : E2E Tests : Scala .......................... SUCCESS [ 17.373 s]
[INFO] Flink : E2E Tests : Kinesis SQL tests .............. SUCCESS [ 2.019 s]
[INFO] Flink : E2E Tests : Kinesis Firehose SQL tests ..... SUCCESS [ 2.118 s]
[INFO] Flink : E2E Tests : SQL ............................ SUCCESS [ 3.836 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [ 3.277 s]
[INFO] Flink : Table : Test Utils ......................... SUCCESS [ 4.389 s]
[INFO] Flink : Contrib : .................................. SUCCESS [ 0.957 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [ 3.463 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [ 3.867 s]
[INFO] Flink : Docs ....................................... SUCCESS [ 12.552 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [ 1.111 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [ 5.820 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [ 1.744 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [ 1.685 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [ 2.370 s]
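The Runtime web module flagged in the log above is the usual trouble spot: it builds Flink's web UI via the frontend-maven-plugin, which downloads Node.js and runs npm, so timeouts against the default npm registry are the most common cause of failure. A hedged sketch of the usual workaround, in line with this guide's domestic-mirror theme (the mirror URL is one common choice, not the only one):

```shell
# Sketch, not part of the build log: point npm at a domestic mirror
# before building flink-runtime-web, so the frontend-maven-plugin's
# npm install step does not time out against the default registry.
npm config set registry https://registry.npmmirror.com
npm config get registry   # verify the mirror took effect
```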
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : 1.15.3:
[INFO]
[INFO] Flink : ............................................ SUCCESS [ 7.995 s]
[INFO] Flink : Annotations ................................ SUCCESS [ 8.043 s]
[INFO] Flink : Architecture Tests ......................... SUCCESS [ 0.385 s]
[INFO] Flink : Architecture Tests : Base .................. SUCCESS [ 1.766 s]
[INFO] Flink : Test utils : ............................... SUCCESS [ 0.265 s]
[INFO] Flink : Test utils : Junit ......................... SUCCESS [ 7.279 s]
[INFO] Flink : Metrics : .................................. SUCCESS [ 0.262 s]
[INFO] Flink : Metrics : Core ............................. SUCCESS [ 3.096 s]
[INFO] Flink : Core ....................................... SUCCESS [ 59.517 s]
[INFO] Flink : Table : .................................... SUCCESS [ 0.241 s]
[INFO] Flink : Table : Common ............................. SUCCESS [ 18.535 s]
[INFO] Flink : Table : API Java ........................... SUCCESS [ 8.766 s]
[INFO] Flink : Java ....................................... SUCCESS [ 10.929 s]
[INFO] Flink : Connectors : ............................... SUCCESS [ 0.243 s]
[INFO] Flink : Connectors : File Sink Common .............. SUCCESS [ 1.152 s]
[INFO] Flink : RPC : ...................................... SUCCESS [ 0.322 s]
[INFO] Flink : RPC : Core ................................. SUCCESS [ 1.329 s]
[INFO] Flink : RPC : Akka ................................. SUCCESS [ 17.470 s]
[INFO] Flink : RPC : Akka-Loader .......................... SUCCESS [ 5.843 s]
[INFO] Flink : Queryable state : .......................... SUCCESS [ 0.191 s]
[INFO] Flink : Queryable state : Client Java .............. SUCCESS [ 1.487 s]
[INFO] Flink : FileSystems : .............................. SUCCESS [ 0.273 s]
[INFO] Flink : FileSystems : Hadoop FS .................... SUCCESS [ 7.420 s]
[INFO] Flink : Runtime .................................... SUCCESS [01:57 min]
[INFO] Flink : Streaming Java ............................. SUCCESS [ 33.004 s]
[INFO] Flink : Table : API bridge base .................... SUCCESS [ 1.097 s]
[INFO] Flink : Table : API Java bridge .................... SUCCESS [ 2.483 s]
[INFO] Flink : Table : Code Splitter ...................... SUCCESS [ 4.870 s]
[INFO] Flink : Optimizer .................................. SUCCESS [ 10.957 s]
[INFO] Flink : Clients .................................... SUCCESS [ 7.801 s]
[INFO] Flink : DSTL ....................................... SUCCESS [ 0.325 s]
[INFO] Flink : DSTL : DFS ................................. SUCCESS [ 2.526 s]
[INFO] Flink : State backends : ........................... SUCCESS [ 0.458 s]
[INFO] Flink : State backends : RocksDB ................... SUCCESS [ 7.054 s]
[INFO] Flink : State backends : Changelog ................. SUCCESS [ 2.164 s]
[INFO] Flink : Test utils : Utils ......................... SUCCESS [ 4.810 s]
[INFO] Flink : Libraries : ................................ SUCCESS [ 0.210 s]
[INFO] Flink : Libraries : CEP ............................ SUCCESS [ 5.664 s]
[INFO] Flink : Table : Runtime ............................ SUCCESS [ 16.529 s]
[INFO] Flink : Scala ...................................... SUCCESS [01:32 min]
[INFO] Flink : Table : SQL Parser ......................... SUCCESS [ 9.768 s]
[INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [ 6.073 s]
[INFO] Flink : Table : API Scala .......................... SUCCESS [ 22.689 s]
[INFO] Flink : Test utils : Connectors .................... SUCCESS [ 2.072 s]
[INFO] Flink : Architecture Tests : Test .................. SUCCESS [ 1.176 s]
[INFO] Flink : Connectors : Base .......................... SUCCESS [ 3.125 s]
[INFO] Flink : Connectors : Files ......................... SUCCESS [ 5.213 s]
[INFO] Flink : Examples : ................................. SUCCESS [ 0.302 s]
[INFO] Flink : Examples : Batch ........................... SUCCESS [ 21.655 s]
[INFO] Flink : Connectors : Hadoop compatibility .......... SUCCESS [ 11.743 s]
[INFO] Flink : Tests ...................................... SUCCESS [01:11 min]
[INFO] Flink : Streaming Scala ............................ SUCCESS [01:00 min]
[INFO] Flink : Table : API Scala bridge ................... SUCCESS [ 19.945 s]
[INFO] Flink : Table : Planner ............................ SUCCESS [05:36 min]
[INFO] Flink : Formats : .................................. SUCCESS [ 0.134 s]
[INFO] Flink : Format : Common ............................ SUCCESS [ 0.343 s]
[INFO] Flink : Formats : Csv .............................. SUCCESS [ 1.905 s]
[INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [ 2.757 s]
[INFO] Flink : Formats : Orc .............................. SUCCESS [ 2.996 s]
[INFO] Flink : Formats : Orc nohive ....................... SUCCESS [ 2.281 s]
[INFO] Flink : Formats : Avro ............................. SUCCESS [ 6.245 s]
[INFO] Flink : Formats : Parquet .......................... SUCCESS [ 16.343 s]
[INFO] Flink : Connectors : Hive .......................... SUCCESS [ 33.385 s]
[INFO] Flink : Python ..................................... SUCCESS [01:00 min]
[INFO] Flink : Table : SQL Client ......................... SUCCESS [ 4.377 s]
[INFO] Flink : Connectors : AWS Base ...................... SUCCESS [ 1.983 s]
[INFO] Flink : Connectors : Cassandra ..................... SUCCESS [ 6.715 s]
[INFO] Flink : Formats : Json ............................. SUCCESS [ 2.139 s]
[INFO] Flink : Connectors : Elasticsearch base ............ SUCCESS [ 3.682 s]
[INFO] Flink : Connectors : Elasticsearch 6 ............... SUCCESS [ 3.211 s]
[INFO] Flink : Connectors : Elasticsearch 7 ............... SUCCESS [ 1.792 s]
[INFO] Flink : Connectors : Google PubSub ................. SUCCESS [ 1.583 s]
[INFO] Flink : Connectors : HBase base .................... SUCCESS [ 2.133 s]
[INFO] Flink : Connectors : HBase 1.4 ..................... SUCCESS [ 6.742 s]
[INFO] Flink : Connectors : HBase 2.2 ..................... SUCCESS [ 5.883 s]
[INFO] Flink : Connectors : JDBC .......................... SUCCESS [ 4.605 s]
[INFO] Flink : Metrics : JMX .............................. SUCCESS [ 0.625 s]
[INFO] Flink : Formats : Avro confluent registry .......... SUCCESS [ 1.078 s]
[INFO] Flink : Connectors : Kafka ......................... SUCCESS [ 7.781 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Streams ... SUCCESS [ 1.645 s]
[INFO] Flink : Connectors : Kinesis ....................... SUCCESS [ 28.342 s]
[INFO] Flink : Connectors : Nifi .......................... SUCCESS [ 1.144 s]
[INFO] Flink : Connectors : Pulsar ........................ SUCCESS [ 19.607 s]
[INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [ 1.307 s]
[INFO] Flink : Architecture Tests : Production ............ SUCCESS [ 2.979 s]
[INFO] Flink : FileSystems : Hadoop FS shaded ............. SUCCESS [ 6.329 s]
[INFO] Flink : FileSystems : S3 FS Base ................... SUCCESS [ 2.042 s]
[INFO] Flink : FileSystems : S3 FS Hadoop ................. SUCCESS [ 13.426 s]
[INFO] Flink : FileSystems : S3 FS Presto ................. SUCCESS [01:31 min]
[INFO] Flink : FileSystems : OSS FS ....................... SUCCESS [ 21.164 s]
[INFO] Flink : FileSystems : Azure FS Hadoop .............. SUCCESS [ 30.780 s]
[INFO] Flink : FileSystems : Google Storage FS Hadoop ..... SUCCESS [ 33.817 s]
[INFO] Flink : Runtime web ................................ FAILURE [ 7.514 s]
[INFO] Flink : Connectors : HCatalog ...................... SKIPPED
[INFO] Flink : Connectors : Amazon Kinesis Data Firehose .. SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SKIPPED
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Kafka ................... SKIPPED
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Streams SKIPPED
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Firehose SKIPPED
[INFO] Flink : Connectors : SQL : Kinesis ................. SKIPPED
[INFO] Flink : Connectors : SQL : Pulsar .................. SKIPPED
[INFO] Flink : Connectors : SQL : RabbitMQ ................ SKIPPED
[INFO] Flink : Formats : Sequence file .................... SKIPPED
[INFO] Flink : Formats : Compress ......................... SKIPPED
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SKIPPED
[INFO] Flink : Formats : JSON AWS Glue Schema Registry .... SKIPPED
[INFO] Flink : Formats : SQL Orc .......................... SKIPPED
[INFO] Flink : Formats : SQL Parquet ...................... SKIPPED
[INFO] Flink : Formats : SQL Avro ......................... SKIPPED
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SKIPPED
[INFO] Flink : Examples : Streaming ....................... SKIPPED
[INFO] Flink : Examples : Table ........................... SKIPPED
[INFO] Flink : Examples : Build Helper : .................. SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming State machine SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SKIPPED
[INFO] Flink : Container .................................. SKIPPED
[INFO] Flink : Queryable state : Runtime .................. SKIPPED
[INFO] Flink : Dist-Scala ................................. SKIPPED
[INFO] Flink : Kubernetes ................................. SKIPPED
[INFO] Flink : Yarn ....................................... SKIPPED
[INFO] Flink : Table : API Java Uber ...................... SKIPPED
[INFO] Flink : Table : Planner Loader Bundle .............. SKIPPED
[INFO] Flink : Table : Planner Loader ..................... SKIPPED
[INFO] Flink : Libraries : Gelly .......................... SKIPPED
[INFO] Flink : Libraries : Gelly scala .................... SKIPPED
[INFO] Flink : Libraries : Gelly Examples ................. SKIPPED
[INFO] Flink : External resources : ....................... SKIPPED
[INFO] Flink : External resources : GPU ................... SKIPPED
[INFO] Flink : Metrics : Dropwizard ....................... SKIPPED
[INFO] Flink : Metrics : Graphite ......................... SKIPPED
[INFO] Flink : Metrics : InfluxDB ......................... SKIPPED
[INFO] Flink : Metrics : Prometheus ....................... SKIPPED
[INFO] Flink : Metrics : StatsD ........................... SKIPPED
[INFO] Flink : Metrics : Datadog .......................... SKIPPED
[INFO] Flink : Metrics : Slf4j ............................ SKIPPED
[INFO] Flink : Libraries : CEP Scala ...................... SKIPPED
[INFO] Flink : Libraries : State processor API ............ SKIPPED
[INFO] Flink : Dist ....................................... SKIPPED
[INFO] Flink : Yarn Tests ................................. SKIPPED
[INFO] Flink : E2E Tests : ................................ SKIPPED
[INFO] Flink : E2E Tests : CLI ............................ SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading program SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SKIPPED
[INFO] Flink : E2E Tests : Dataset allround ............... SKIPPED
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SKIPPED
[INFO] Flink : E2E Tests : Datastream allround ............ SKIPPED
[INFO] Flink : E2E Tests : Batch SQL ...................... SKIPPED
[INFO] Flink : E2E Tests : Stream SQL ..................... SKIPPED
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SKIPPED
[INFO] Flink : E2E Tests : High parallelism iterations .... SKIPPED
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SKIPPED
[INFO] Flink : E2E Tests : Queryable state ................ SKIPPED
[INFO] Flink : E2E Tests : Local recovery and allocation .. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SKIPPED
[INFO] Flink : Quickstart : ............................... SKIPPED
[INFO] Flink : Quickstart : Java .......................... SKIPPED
[INFO] Flink : Quickstart : Scala ......................... SKIPPED
[INFO] Flink : E2E Tests : Quickstart ..................... SKIPPED
[INFO] Flink : E2E Tests : Confluent schema registry ...... SKIPPED
[INFO] Flink : E2E Tests : Stream state TTL ............... SKIPPED
[INFO] Flink : E2E Tests : SQL client ..................... SKIPPED
[INFO] Flink : E2E Tests : File sink ...................... SKIPPED
[INFO] Flink : E2E Tests : State evolution ................ SKIPPED
[INFO] Flink : E2E Tests : RocksDB state memory control ... SKIPPED
[INFO] Flink : E2E Tests : Common ......................... SKIPPED
[INFO] Flink : E2E Tests : Metrics availability ........... SKIPPED
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SKIPPED
[INFO] Flink : E2E Tests : Heavy deployment ............... SKIPPED
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka ................ SKIPPED
[INFO] Flink : E2E Tests : Plugins : ...................... SKIPPED
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SKIPPED
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SKIPPED
[INFO] Flink : E2E Tests : TPCH ........................... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SKIPPED
[INFO] Flink : E2E Tests : Common Kafka ................... SKIPPED
[INFO] Flink : E2E Tests : TPCDS .......................... SKIPPED
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SKIPPED
[INFO] Flink : E2E Tests : Python ......................... SKIPPED
[INFO] Flink : E2E Tests : HBase .......................... SKIPPED
[INFO] Flink : E2E Tests : Pulsar ......................... SKIPPED
[INFO] Flink : E2E Tests : Avro AWS Glue Schema Registry .. SKIPPED
[INFO] Flink : E2E Tests : JSON AWS Glue Schema Registry .. SKIPPED
[INFO] Flink : E2E Tests : Scala .......................... SKIPPED
[INFO] Flink : E2E Tests : Kinesis SQL tests .............. SKIPPED
[INFO] Flink : E2E Tests : Kinesis Firehose SQL tests ..... SKIPPED
[INFO] Flink : E2E Tests : SQL ............................ SKIPPED
[INFO] Flink : State backends : Heap spillable ............ SKIPPED
[INFO] Flink : Table : Test Utils ......................... SKIPPED
[INFO] Flink : Contrib : .................................. SKIPPED
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SKIPPED
[INFO] Flink : FileSystems : Tests ........................ SKIPPED
[INFO] Flink : Docs ....................................... SKIPPED
[INFO] Flink : Walkthrough : .............................. SKIPPED
[INFO] Flink : Walkthrough : Common ....................... SKIPPED
[INFO] Flink : Walkthrough : Datastream Java .............. SKIPPED
[INFO] Flink : Walkthrough : Datastream Scala ............. SKIPPED
[INFO] Flink : Tools : CI : Java .......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:55 min
[INFO] Finished at: 2024-07-23T20:20:59+08:00
[INFO] ------------------------------------------------------------------------
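When a single module such as Runtime web fails partway through, the whole ~200-subproject reactor does not need to be rebuilt: Maven can resume from the failed module with `-rf` (its error output even prints the exact command to use). A sketch, assuming the module artifactId from the log above; the other flags mirror a typical bigtop invocation and are an assumption:

```shell
# Resume the reactor from the failed module instead of rebuilding
# everything; Maven's own error message suggests this exact form.
mvn clean install -DskipTests -rf :flink-runtime-web
```

The "Reactor Summary for Flink : Runtime web" that follows is consistent with such a resumed run: it starts at Runtime web and rebuilds only the modules after it.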
--------------------------------------------------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : Runtime web 1.15.3:
[INFO]
[INFO] Flink : Runtime web ................................ SUCCESS [03:43 min]   <-- the 93rd subproject
[INFO] Flink : Connectors : HCatalog ...................... SUCCESS [ 27.049 s]
[INFO] Flink : Connectors : Amazon Kinesis Data Firehose .. SUCCESS [ 7.150 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SUCCESS [ 21.324 s]
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SUCCESS [ 15.506 s]
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SUCCESS [ 13.099 s]
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SUCCESS [ 22.985 s]
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SUCCESS [ 24.233 s]
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SUCCESS [ 21.787 s]
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SUCCESS [ 18.690 s]
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SUCCESS [ 28.055 s]
[INFO] Flink : Connectors : SQL : Kafka ................... SUCCESS [ 2.946 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Streams SUCCESS [ 6.017 s]
[INFO] Flink : Connectors : SQL : Amazon Kinesis Data Firehose SUCCESS [ 5.719 s]
[INFO] Flink : Connectors : SQL : Kinesis ................. SUCCESS [ 15.532 s]
[INFO] Flink : Connectors : SQL : Pulsar .................. SUCCESS [ 8.219 s]
[INFO] Flink : Connectors : SQL : RabbitMQ ................ SUCCESS [ 1.337 s]
[INFO] Flink : Formats : Sequence file .................... SUCCESS [ 2.858 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [ 2.911 s]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SUCCESS [ 7.371 s]
[INFO] Flink : Formats : JSON AWS Glue Schema Registry .... SUCCESS [ 6.201 s]
[INFO] Flink : Formats : SQL Orc .......................... SUCCESS [ 1.209 s]
[INFO] Flink : Formats : SQL Parquet ...................... SUCCESS [ 2.485 s]
[INFO] Flink : Formats : SQL Avro ......................... SUCCESS [ 2.524 s]
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SUCCESS [ 2.992 s]
[INFO] Flink : Examples : Streaming ....................... SUCCESS [ 33.861 s]
[INFO] Flink : Examples : Table ........................... SUCCESS [ 19.095 s]
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [ 1.024 s]
[INFO] Flink : Examples : Build Helper : Streaming State machine SUCCESS [ 2.983 s]
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SUCCESS [ 9.270 s]
[INFO] Flink : Container .................................. SUCCESS [ 1.725 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [ 4.084 s]
[INFO] Flink : Dist-Scala ................................. SUCCESS [ 3.771 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 22.354 s]
[INFO] Flink : Yarn ....................................... SUCCESS [ 12.413 s]
[INFO] Flink : Table : API Java Uber ...................... SUCCESS [ 6.933 s]
[INFO] Flink : Table : Planner Loader Bundle .............. SUCCESS [ 7.476 s]
[INFO] Flink : Table : Planner Loader ..................... SUCCESS [ 7.382 s]
[INFO] Flink : Libraries : Gelly .......................... SUCCESS [ 16.101 s]
[INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 42.011 s]
[INFO] Flink : Libraries : Gelly Examples ................. SUCCESS [ 26.385 s]
[INFO] Flink : External resources : ....................... SUCCESS [ 0.745 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [ 1.493 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [ 1.778 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [ 1.301 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [ 3.347 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [ 2.463 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [ 1.234 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [ 1.967 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [ 1.424 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 28.689 s]
[INFO] Flink : Libraries : State processor API ............ SUCCESS [ 8.257 s]
[INFO] Flink : Dist ....................................... SUCCESS [ 29.859 s]
[INFO] Flink : Yarn Tests ................................. SUCCESS [ 13.865 s]
[INFO] Flink : E2E Tests : ................................ SUCCESS [ 0.558 s]
[INFO] Flink : E2E Tests : CLI ............................ SUCCESS [ 1.343 s]
[INFO] Flink : E2E Tests : Parent Child classloading program SUCCESS [ 1.706 s]
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SUCCESS [ 1.407 s]
[INFO] Flink : E2E Tests : Dataset allround ............... SUCCESS [ 1.325 s]
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SUCCESS [ 1.579 s]
[INFO] Flink : E2E Tests : Datastream allround ............ SUCCESS [ 4.573 s]
[INFO] Flink : E2E Tests : Batch SQL ...................... SUCCESS [ 1.514 s]
[INFO] Flink : E2E Tests : Stream SQL ..................... SUCCESS [ 1.850 s]
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SUCCESS [ 1.870 s]
[INFO] Flink : E2E Tests : High parallelism iterations .... SUCCESS [ 12.986 s]
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SUCCESS [ 3.228 s]
[INFO] Flink : E2E Tests : Queryable state ................ SUCCESS [ 5.598 s]
[INFO] Flink : E2E Tests : Local recovery and allocation .. SUCCESS [ 1.794 s]
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SUCCESS [ 7.156 s]
[INFO] Flink : Quickstart : ............................... SUCCESS [ 2.048 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [ 1.930 s]
[INFO] Flink : Quickstart : Scala ......................... SUCCESS [ 1.234 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SUCCESS [ 3.613 s]
[INFO] Flink : E2E Tests : Confluent schema registry ...... SUCCESS [ 6.589 s]
[INFO] Flink : E2E Tests : Stream state TTL ............... SUCCESS [ 11.081 s]
[INFO] Flink : E2E Tests : SQL client ..................... SUCCESS [ 3.873 s]
[INFO] Flink : E2E Tests : File sink ...................... SUCCESS [ 1.760 s]
[INFO] Flink : E2E Tests : State evolution ................ SUCCESS [ 2.769 s]
[INFO] Flink : E2E Tests : RocksDB state memory control ... SUCCESS [ 3.049 s]
[INFO] Flink : E2E Tests : Common ......................... SUCCESS [ 5.366 s]
[INFO] Flink : E2E Tests : Metrics availability ........... SUCCESS [ 1.496 s]
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SUCCESS [ 1.555 s]
[INFO] Flink : E2E Tests : Heavy deployment ............... SUCCESS [ 15.932 s]
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [ 2.823 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [ 2.473 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [ 12.795 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [ 0.654 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [ 1.148 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [ 1.301 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [ 4.617 s]
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 35.494 s]
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [ 8.015 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [ 5.963 s]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [ 6.296 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [ 1.781 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [ 6.986 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [ 6.613 s]
[INFO] Flink : E2E Tests : Pulsar ......................... SUCCESS [ 5.003 s]
[INFO] Flink : E2E Tests : Avro AWS Glue Schema Registry .. SUCCESS [ 4.342 s]
[INFO] Flink : E2E Tests : JSON AWS Glue Schema Registry .. SUCCESS [ 6.271 s]
[INFO] Flink : E2E Tests : Scala .......................... SUCCESS [ 14.172 s]
[INFO] Flink : E2E Tests : Kinesis SQL tests .............. SUCCESS [ 2.021 s]
[INFO] Flink : E2E Tests : Kinesis Firehose SQL tests ..... SUCCESS [ 2.207 s]
[INFO] Flink : E2E Tests : SQL ............................ SUCCESS [ 3.718 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [ 3.408 s]
[INFO] Flink : Table : Test Utils ......................... SUCCESS [ 3.781 s]
[INFO] Flink : Contrib : .................................. SUCCESS [ 0.773 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [ 2.721 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [ 3.393 s]
[INFO] Flink : Docs ....................................... SUCCESS [ 10.543 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [ 0.871 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [ 4.019 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [ 1.421 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [ 1.426 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [ 1.850 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:12 min
[INFO] Finished at: 2024-07-23T21:18:32+08:00
[INFO] ------------------------------------------------------------------------
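When a large multi-module reactor build like the Flink one above fails partway, Maven does not have to rebuild the modules that already succeeded. A minimal sketch using Maven's standard `-rf` (resume-from) flag; the module id here is hypothetical, the real one is printed in the `[ERROR]` summary after the reactor table:

```shell
# Resume a failed reactor build from the module that broke, instead of
# restarting all 200+ subprojects from scratch. The module id below is a
# placeholder; copy the actual ":artifactId" from Maven's failure message.
RESUME_FROM=":flink-connector-kafka"
echo "mvn -rf ${RESUME_FROM} clean install -DskipTests"
```

Combined with `-DskipTests` (already the norm when packaging for Bigtop), this can cut a failed 18-minute reactor run down to the remaining modules only.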
docker-centos Compile
- Bigtop Series - How to build RPM/DEB packages with Bigtop: /video/BV1DL411X7xZ
- Download the compilation environment:
docker pull /bigtop/slaves:3.2.0-centos-7
# Map the local code directory G:\OpenSource\Data\platform\bigtop to the /ws directory
# Map the local Maven repository directory to /root, so that Maven, Gradle, and Ant can reuse it later
docker run -d -it -p 8000:8000 --network ambari -v G:\OpenSource\Data\platform\bigtop:/ws -v F:\docker\data\bigtop:/root --workdir /ws --name repo bigtop/slaves:3.2.0-centos-7
docker pull /bigtop/slaves:trunk-centos-7
docker pull /bigtop/puppet:trunk-centos-7
docker pull mariadb:10.2
docker pull centos:7
docker pull mysql:5.7
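With the container from the `docker run` command above running (named `repo`, source mounted at /ws), individual components can be built from inside it. A sketch, assuming the standard Gradle wrapper in the bigtop source root; `zookeeper-rpm` is just one of the per-component tasks:

```shell
# Attach to the running build container ("repo", created by the docker run
# command above) and build one component's RPM from the source tree at /ws.
CONTAINER=repo
BUILD_CMD="./gradlew zookeeper-rpm"   # any $component-rpm task works here
echo "docker exec -it ${CONTAINER} bash -c 'cd /ws && ${BUILD_CMD}'"
```

Because /root is mapped to a host directory, the Maven/Gradle caches survive container restarts, which is what makes repeated builds in this image tolerable.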
bigtop source code analysis
- task list
- packages-help: All package build related tasks information
- bom-json: List the components of the stack in json format
- all-components: List the components of the stack
- Individual components: e.g. zookeeper
- ${component}-download: Download $component artifacts
- ${component}-tar: Preparing a tarball for $component artifacts
- $component-deb: Building DEB for $component artifacts
- $component-sdeb: Building SDEB for $component artifacts
- $component-rpm: Building RPM for $component artifacts
- $component-srpm: Building SRPM for $component artifacts
- $component-pkg: Invoking a native binary packaging component $ptype
- $component-spkg: Invoking a native binary packaging component s$ptype
- $component-pkg-ind: Invoking a native binary packaging for $component in Docker
- $component-version: Show version of $component component
- ${component}_vardefines: variable definitions
- $component-info: Info about $component component build
- $component-relnotes: Preparing release notes for $component. Not yet implemented!
- $component-clean: Removing $component component build and output directories
- $component-help: List of available tasks for $component
- All Components
- srpm: Build all SRPM packages for the stack components
- rpm: Build all RPM packages for the stack
- sdeb: Build all SDEB packages for the stack components
- deb: Build all DEB packages for the stack components
- pkgs: Build all native packages for the stack components
- pkgs-ind: Build all native packages for the stack components inside Docker
- allclean: Removing $BUILD_DIR, $OUTPUT_DIR, and $DIST_DIR; cleaning all components' build and output directories
- realclean: Removing $DL_DIR
- apt: Creating APT repository
- yum: Creating YUM repository
- repo: Invoking a native repository target $
- repo-ind: Invoking a native repository in Docker
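The tasks above are invoked through the Gradle wrapper in the bigtop root directory. A sketch of typical invocations, using `zookeeper` as the example component (any component from the list works the same way):

```shell
# Typical invocations of the bigtop task list above (run from the bigtop root).
GRADLEW=./gradlew
for task in zookeeper-rpm zookeeper-clean rpm yum; do
  # zookeeper-rpm   - build RPM packages for one component
  # zookeeper-clean - remove that component's build/output dirs
  # rpm             - build RPMs for every stack component
  # yum             - assemble the built RPMs into a YUM repository
  echo "${GRADLEW} ${task}"
done
```

Running `./gradlew packages-help` prints this same task catalog, and `./gradlew ${component}-help` narrows it to a single component.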