Hive

Let's use Sqoop to load MySQL data into Hive and query it!

케키키케 2020. 12. 15. 20:01

Prerequisites

Install Sqoop

bachong.tistory.com/50

 

Let's install Sqoop (1.4.7)! MySQL5 to Hadoop3

 

Install MySQL and set up the sample data

bachong.tistory.com/48

 

Let's install MySQL! (Ubuntu 18.04) + loading the sample data

 

Install Hive

bachong.tistory.com/49

 

Let's install Hive 3.1.2! + hooking it up to Hadoop 3.2.1

 

Create the HDFS directories

Before creating any Hive tables, the /tmp and /user/hive/warehouse directories must exist in HDFS.

$ hadoop fs -mkdir /tmp
$ hadoop fs -mkdir -p /user/hive/warehouse
$ hadoop fs -chmod g+w /tmp
$ hadoop fs -chmod -R g+w /user
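
To confirm the directories and group-write bits took effect, a quick check like this works (same paths as above):

$ hadoop fs -ls -d /tmp /user/hive/warehouse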

 

Configure MySQL (the metadata store)

Create the metastore database and grant the account privileges on it

$ sudo mysql
[sudo] password for hadoopuser:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 127
Server version: 5.7.32-0ubuntu0.18.04.1 (Ubuntu)

Copyright (c) 2000, 2020, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> create database metastore default character set utf8;
Query OK, 1 row affected (0.00 sec)

mysql> grant all privileges on metastore.* to 'hadoopuser'@'HOST';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
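
To double-check that the grant took effect, the account's privileges can be listed (HOST being the same placeholder as above):

mysql> show grants for 'hadoopuser'@'HOST';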

 

Add the MySQL Connector

$ wget http://www.java2s.com/Code/JarDownload/mysql/mysql-connector-java-commercial-5.1.7-bin.jar.zip
$ unzip mysql-connector-java-commercial-5.1.7-bin.jar.zip
$ cp mysql-connector-java-commercial-5.1.7-bin.jar $HIVE_HOME/lib/
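
One hedged note: Sqoop itself also needs a MySQL JDBC driver on its classpath when it runs the import. If that wasn't already handled during the Sqoop install, the same jar can be copied into Sqoop's lib as well:

$ cp mysql-connector-java-commercial-5.1.7-bin.jar $SQOOP_HOME/lib/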

 

Edit $HIVE_HOME/conf/hive-site.xml

Add the MySQL-related settings to hive-site.xml. (Note: the hive.metastore.local property was removed way back in Hive 0.10, so Hive 3 will warn that it does not exist; the warning shows up in the logs below and is harmless.)

<configuration>
        <property>
                <name>hive.metastore.local</name>
                <value>false</value>
        </property>
        <property>
                <name>hive.metastore.warehouse.dir</name>
                <value>/user/hive/warehouse</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionURL</name>
                <value>jdbc:mysql://MYSQL_HOST:3306/metastore?createDatabaseIfNotExist=true&amp;useSSL=false</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionDriverName</name>
                <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionUserName</name>
                <value>MYSQL_USER</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionPassword</name>
                <value>MYSQL_PASSWORD</value>
        </property>
</configuration>
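
Before moving on, it's worth confirming that the account and password written into hive-site.xml can actually reach the metastore database. A minimal check, using the same placeholders as above:

$ mysql -h MYSQL_HOST -u MYSQL_USER -p metastore -e 'select 1;'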

 

Initialize the schema

schematool location: $HIVE_HOME/bin

$ schematool -dbType mysql -initSchema 
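
Once initialization finishes, the same tool can report what it wrote (a hedged example; -info prints the connection URL, user, and schema version):

$ schematool -dbType mysql -info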

 

Check the metastore in MySQL

$ mysql -h HOST -u hadoopuser -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 131
Server version: 5.7.32-0ubuntu0.18.04.1 (Ubuntu)

Copyright (c) 2000, 2020, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| employees          |
| metastore          |
| mysql              |
| performance_schema |
| sys                |
+--------------------+
6 rows in set (0.00 sec)

mysql> use metastore;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> show tables;
+-------------------------------+
| Tables_in_metastore           |
+-------------------------------+
| AUX_TABLE                     |
| BUCKETING_COLS                |
| CDS                           |
| COLUMNS_V2                    |
| COMPACTION_QUEUE              |
| COMPLETED_COMPACTIONS         |
| COMPLETED_TXN_COMPONENTS      |
| CTLGS                         |
| DATABASE_PARAMS               |
| DBS                           |
| DB_PRIVS                      |
| DELEGATION_TOKENS             |
| FUNCS                         |
| FUNC_RU                       |
| GLOBAL_PRIVS                  |
| HIVE_LOCKS                    |
| IDXS                          |
| INDEX_PARAMS                  |
| I_SCHEMA                      |
| KEY_CONSTRAINTS               |
| MASTER_KEYS                   |
| MATERIALIZATION_REBUILD_LOCKS |
| METASTORE_DB_PROPERTIES       |
| MIN_HISTORY_LEVEL             |
| MV_CREATION_METADATA          |
| MV_TABLES_USED                |
| NEXT_COMPACTION_QUEUE_ID      |
| NEXT_LOCK_ID                  |
| NEXT_TXN_ID                   |
| NEXT_WRITE_ID                 |
| NOTIFICATION_LOG              |
| NOTIFICATION_SEQUENCE         |
| NUCLEUS_TABLES                |
| PARTITIONS                    |
| PARTITION_EVENTS              |
| PARTITION_KEYS                |
| PARTITION_KEY_VALS            |
| PARTITION_PARAMS              |
| PART_COL_PRIVS                |
| PART_COL_STATS                |
| PART_PRIVS                    |
| REPL_TXN_MAP                  |
| ROLES                         |
| ROLE_MAP                      |
| RUNTIME_STATS                 |
| SCHEMA_VERSION                |
| SDS                           |
| SD_PARAMS                     |
| SEQUENCE_TABLE                |
| SERDES                        |
| SERDE_PARAMS                  |
| SKEWED_COL_NAMES              |
| SKEWED_COL_VALUE_LOC_MAP      |
| SKEWED_STRING_LIST            |
| SKEWED_STRING_LIST_VALUES     |
| SKEWED_VALUES                 |
| SORT_COLS                     |
| TABLE_PARAMS                  |
| TAB_COL_STATS                 |
| TBLS                          |
| TBL_COL_PRIVS                 |
| TBL_PRIVS                     |
| TXNS                          |
| TXN_COMPONENTS                |
| TXN_TO_WRITE_ID               |
| TYPES                         |
| TYPE_FIELDS                   |
| VERSION                       |
| WM_MAPPING                    |
| WM_POOL                       |
| WM_POOL_TO_TRIGGER            |
| WM_RESOURCEPLAN               |
| WM_TRIGGER                    |
| WRITE_SET                     |
+-------------------------------+
74 rows in set (0.00 sec)
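
Among these, the VERSION table records the schema version that schematool wrote, which makes for a quick final check:

mysql> select SCHEMA_VERSION, VERSION_COMMENT from VERSION;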

 

 

MySQL to HIVE!

$ sqoop import --connect jdbc:mysql://MYSQL_HOST:3306/employees --table departments --hive-import --username hadoopuser -P

ERROR. HIVE_CONF_DIR is set correctly, yet it keeps telling me to check it. This stupid Sqoop! You dummy!!

2020-12-15 02:35:00,633 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
2020-12-15 02:35:00,633 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more

 

Searching around: the ClassNotFoundException means Sqoop can't find Hive's classes on its own classpath, and the common advice is to download hive-common.jar and drop it into Sqoop's library directory. Let's give it a try.

Sqoop library path: $SQOOP_HOME/lib

The jar file can be downloaded from the link below.

www.java2s.com/Code/Jar/h/Downloadhivecommon0100jar.htm

 

Download hive-common-0.10.0.jar (www.java2s.com)
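
Alternatively, since Hive 3.1.2 is already installed, the hive-common jar that Hive itself ships can probably be reused instead of the ancient 0.10.0 build; a sketch, using the install path from the Hive post:

$ cp $HIVE_HOME/lib/hive-common-3.1.2.jar $SQOOP_HOME/lib/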

 

After adding the jar and rerunning the same way, a huge wall of output scrolled past, and at the end it said complete.

Why doesn't Sqoop just ship the jars it needs? So annoying, heh.

$ sqoop import --connect jdbc:mysql://MYSQL_HOST:3306/employees --table departments --hive-import --create-hive-table --hive-table test.departments --username hadoopuser -P

* The Hive table is also specified this time: test is the database name, departments is the table name. (The database was already created; a sketch of how is just below.)
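
If the test database didn't exist yet, it could be created first with something like this (a minimal sketch):

$ hive -e "create database if not exists test;"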

$ sqoop import --connect jdbc:mysql://MYSQL_HOST:3306/employees --table departments --hive-import --create-hive-table --hive-table test.departments --username hadoopuser -P
Warning: /usr/local/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
2020-12-15 19:53:33,204 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
2020-12-15 19:53:34,962 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
2020-12-15 19:53:34,962 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
2020-12-15 19:53:35,027 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2020-12-15 19:53:35,032 INFO tool.CodeGenTool: Beginning code generation
Tue Dec 15 19:53:35 KST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2020-12-15 19:53:35,280 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
2020-12-15 19:53:35,292 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
2020-12-15 19:53:35,296 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoopuser/compile/6bf69c5d5082d9ccae941bb1fcc46109/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2020-12-15 19:53:36,243 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoopuser/compile/6bf69c5d5082d9ccae941bb1fcc46109/departments.jar
2020-12-15 19:53:36,251 WARN manager.MySQLManager: It looks like you are importing from mysql.
2020-12-15 19:53:36,251 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
2020-12-15 19:53:36,251 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
2020-12-15 19:53:36,252 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
2020-12-15 19:53:36,254 INFO mapreduce.ImportJobBase: Beginning import of departments
2020-12-15 19:53:36,255 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2020-12-15 19:53:36,336 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2020-12-15 19:53:36,746 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2020-12-15 19:53:36,805 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2020-12-15 19:53:36,859 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2020-12-15 19:53:36,859 INFO impl.MetricsSystemImpl: JobTracker metrics system started
Tue Dec 15 19:53:37 KST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2020-12-15 19:53:37,069 INFO db.DBInputFormat: Using read commited transaction isolation
2020-12-15 19:53:37,080 INFO mapreduce.JobSubmitter: number of splits:1
2020-12-15 19:53:37,154 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local331843603_0001
2020-12-15 19:53:37,154 INFO mapreduce.JobSubmitter: Executing with tokens: []
2020-12-15 19:53:37,289 INFO mapred.LocalDistributedCacheManager: Creating symlink: /tmp/hadoop-hadoopuser/mapred/local/job_local331843603_0001_d0f119f6-874a-4703-b66a-65d1ea946977/libjars <- /usr/local/libjars/*
2020-12-15 19:53:37,296 WARN fs.FileUtil: Command 'ln -s /tmp/hadoop-hadoopuser/mapred/local/job_local331843603_0001_d0f119f6-874a-4703-b66a-65d1ea946977/libjars /usr/local/libjars/*' failed 1 with: ln: failed to create symbolic link '/usr/local/libjars/*': No such file or directory

2020-12-15 19:53:37,296 WARN mapred.LocalDistributedCacheManager: Failed to create symlink: /tmp/hadoop-hadoopuser/mapred/local/job_local331843603_0001_d0f119f6-874a-4703-b66a-65d1ea946977/libjars <- /usr/local/libjars/*
2020-12-15 19:53:37,296 INFO mapred.LocalDistributedCacheManager: Localized file:/tmp/hadoop/mapred/staging/hadoopuser331843603/.staging/job_local331843603_0001/libjars as file:/tmp/hadoop-hadoopuser/mapred/local/job_local331843603_0001_d0f119f6-874a-4703-b66a-65d1ea946977/libjars
2020-12-15 19:53:37,330 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
2020-12-15 19:53:37,331 INFO mapreduce.Job: Running job: job_local331843603_0001
2020-12-15 19:53:37,333 INFO mapred.LocalJobRunner: OutputCommitter set in config null
2020-12-15 19:53:37,338 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 2
2020-12-15 19:53:37,338 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2020-12-15 19:53:37,338 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2020-12-15 19:53:37,382 INFO mapred.LocalJobRunner: Waiting for map tasks
2020-12-15 19:53:37,383 INFO mapred.LocalJobRunner: Starting task: attempt_local331843603_0001_m_000000_0
2020-12-15 19:53:37,404 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 2
2020-12-15 19:53:37,405 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2020-12-15 19:53:37,425 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
Tue Dec 15 19:53:37 KST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2020-12-15 19:53:37,430 INFO db.DBInputFormat: Using read commited transaction isolation
2020-12-15 19:53:37,433 INFO mapred.MapTask: Processing split: 1=1 AND 1=1
2020-12-15 19:53:37,480 INFO db.DBRecordReader: Working on split: 1=1 AND 1=1
2020-12-15 19:53:37,481 INFO db.DBRecordReader: Executing query: SELECT `dept_no`, `dept_name` FROM `departments` AS `departments` WHERE ( 1=1 ) AND ( 1=1 )
2020-12-15 19:53:37,484 INFO mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
2020-12-15 19:53:37,486 INFO mapred.LocalJobRunner:
2020-12-15 19:53:37,513 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-12-15 19:53:37,596 INFO mapred.Task: Task:attempt_local331843603_0001_m_000000_0 is done. And is in the process of committing
2020-12-15 19:53:37,599 INFO mapred.LocalJobRunner:
2020-12-15 19:53:37,599 INFO mapred.Task: Task attempt_local331843603_0001_m_000000_0 is allowed to commit now
2020-12-15 19:53:37,614 INFO output.FileOutputCommitter: Saved output of task 'attempt_local331843603_0001_m_000000_0' to hdfs://hadoop1:9000/user/hadoopuser/departments
2020-12-15 19:53:37,615 INFO mapred.LocalJobRunner: map
2020-12-15 19:53:37,615 INFO mapred.Task: Task 'attempt_local331843603_0001_m_000000_0' done.
2020-12-15 19:53:37,621 INFO mapred.Task: Final Counters for attempt_local331843603_0001_m_000000_0: Counters: 21
        File System Counters
                FILE: Number of bytes read=5160
                FILE: Number of bytes written=546234
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=0
                HDFS: Number of bytes written=153
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
                HDFS: Number of bytes read erasure-coded=0
        Map-Reduce Framework
                Map input records=9
                Map output records=9
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=0
                Total committed heap usage (bytes)=352845824
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=153
2020-12-15 19:53:37,621 INFO mapred.LocalJobRunner: Finishing task: attempt_local331843603_0001_m_000000_0
2020-12-15 19:53:37,621 INFO mapred.LocalJobRunner: map task executor complete.
2020-12-15 19:53:38,335 INFO mapreduce.Job: Job job_local331843603_0001 running in uber mode : false
2020-12-15 19:53:38,336 INFO mapreduce.Job:  map 100% reduce 0%
2020-12-15 19:53:38,338 INFO mapreduce.Job: Job job_local331843603_0001 completed successfully
2020-12-15 19:53:38,343 INFO mapreduce.Job: Counters: 21
        File System Counters
                FILE: Number of bytes read=5160
                FILE: Number of bytes written=546234
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=0
                HDFS: Number of bytes written=153
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
                HDFS: Number of bytes read erasure-coded=0
        Map-Reduce Framework
                Map input records=9
                Map output records=9
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=0
                Total committed heap usage (bytes)=352845824
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=153
2020-12-15 19:53:38,350 INFO mapreduce.ImportJobBase: Transferred 153 bytes in 1.5941 seconds (95.9771 bytes/sec)
2020-12-15 19:53:38,350 INFO mapreduce.ImportJobBase: Retrieved 9 records.
2020-12-15 19:53:38,351 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table departments
Tue Dec 15 19:53:38 KST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2020-12-15 19:53:38,366 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
2020-12-15 19:53:38,373 INFO hive.HiveImport: Loading uploaded data into Hive
2020-12-15 19:53:38,380 WARN conf.HiveConf: hive-site.xml not found on CLASSPATH
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2020-12-15 19:53:38,412 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
2020-12-15 19:53:39,778 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
2020-12-15 19:53:39,779 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-12-15 19:53:39,779 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-12-15 19:53:39,779 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2020-12-15 19:53:39,844 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-12-15 19:53:40,141 INFO hive.HiveImport: 2020-12-15 19:53:40,139 INFO  [main] conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/usr/local/hive/conf/hive-site.xml
2020-12-15 19:53:40,379 INFO hive.HiveImport: 2020-12-15 19:53:40,377 WARN  [main] conf.HiveConf (HiveConf.java:initialize(5220)) - HiveConf of name hive.metastore.local does not exist
2020-12-15 19:53:41,924 INFO hive.HiveImport: 2020-12-15 19:53:41,924 WARN  [main] conf.HiveConf (HiveConf.java:initialize(5220)) - HiveConf of name hive.metastore.local does not exist
2020-12-15 19:53:41,932 INFO hive.HiveImport: Hive Session ID = 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:41,932 INFO hive.HiveImport: 2020-12-15 19:53:41,932 INFO  [main] SessionState (SessionState.java:printInfo(1227)) - Hive Session ID = 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:41,987 INFO hive.HiveImport:
2020-12-15 19:53:41,987 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2020-12-15 19:53:41,987 INFO hive.HiveImport: 2020-12-15 19:53:41,987 INFO  [main] SessionState (SessionState.java:printInfo(1227)) -
2020-12-15 19:53:41,987 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2020-12-15 19:53:42,881 INFO hive.HiveImport: 2020-12-15 19:53:42,881 INFO  [main] session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/hadoopuser/63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:42,898 INFO hive.HiveImport: 2020-12-15 19:53:42,898 INFO  [main] session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/hadoopuser/63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:42,903 INFO hive.HiveImport: 2020-12-15 19:53:42,902 INFO  [main] session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/hadoopuser/63eed2d2-8c51-47fc-acb7-6f22c42aa3e4/_tmp_space.db
2020-12-15 19:53:42,915 INFO hive.HiveImport: 2020-12-15 19:53:42,915 INFO  [main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:42,915 INFO hive.HiveImport: 2020-12-15 19:53:42,915 INFO  [main] session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main
2020-12-15 19:53:42,956 INFO hive.HiveImport: 2020-12-15 19:53:42,955 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.HiveConf (HiveConf.java:initialize(5220)) - HiveConf of name hive.metastore.local does not exist
2020-12-15 19:53:43,492 INFO hive.HiveImport: 2020-12-15 19:53:43,492 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:43,516 INFO hive.HiveImport: 2020-12-15 19:53:43,516 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:43,522 INFO hive.HiveImport: 2020-12-15 19:53:43,522 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:43,523 INFO hive.HiveImport: 2020-12-15 19:53:43,523 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file file:/usr/local/hive/conf/hive-site.xml
2020-12-15 19:53:43,525 INFO hive.HiveImport: 2020-12-15 19:53:43,524 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file hivemetastore-site.xml
2020-12-15 19:53:43,525 INFO hive.HiveImport: 2020-12-15 19:53:43,525 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2020-12-15 19:53:43,526 INFO hive.HiveImport: 2020-12-15 19:53:43,526 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file metastore-site.xml
2020-12-15 19:53:43,526 INFO hive.HiveImport: 2020-12-15 19:53:43,526 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2020-12-15 19:53:43,659 INFO hive.HiveImport: 2020-12-15 19:53:43,658 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.Persistence (Log4JLogger.java:info(77)) - Property datanucleus.cache.level2 unknown - will be ignored
2020-12-15 19:53:43,799 INFO hive.HiveImport: 2020-12-15 19:53:43,799 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-1 - Starting...
2020-12-15 19:53:44,121 INFO hive.HiveImport: 2020-12-15 19:53:44,121 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-1 - Driver does not support get/set network timeout for connections. (com.mysql.jdbc.JDBC4Connection.getNetworkTimeout()I)
2020-12-15 19:53:44,124 INFO hive.HiveImport: 2020-12-15 19:53:44,124 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-1 - Start completed.
2020-12-15 19:53:44,155 INFO hive.HiveImport: 2020-12-15 19:53:44,155 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-2 - Starting...
2020-12-15 19:53:44,165 INFO hive.HiveImport: 2020-12-15 19:53:44,165 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-2 - Driver does not support get/set network timeout for connections. (com.mysql.jdbc.JDBC4Connection.getNetworkTimeout()I)
2020-12-15 19:53:44,166 INFO hive.HiveImport: 2020-12-15 19:53:44,166 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-2 - Start completed.
2020-12-15 19:53:44,207 INFO hive.HiveImport: 2020-12-15 19:53:44,207 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:getPMF(670)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2020-12-15 19:53:44,291 INFO hive.HiveImport: 2020-12-15 19:53:44,291 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:44,292 INFO hive.HiveImport: 2020-12-15 19:53:44,292 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:44,441 INFO hive.HiveImport: 2020-12-15 19:53:44,441 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:44,442 INFO hive.HiveImport: 2020-12-15 19:53:44,442 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:44,443 INFO hive.HiveImport: 2020-12-15 19:53:44,442 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:44,443 INFO hive.HiveImport: 2020-12-15 19:53:44,443 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:44,443 INFO hive.HiveImport: 2020-12-15 19:53:44,443 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:44,443 INFO hive.HiveImport: 2020-12-15 19:53:44,443 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,809 INFO hive.HiveImport: 2020-12-15 19:53:45,809 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,810 INFO hive.HiveImport: 2020-12-15 19:53:45,809 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,810 INFO hive.HiveImport: 2020-12-15 19:53:45,810 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,810 INFO hive.HiveImport: 2020-12-15 19:53:45,810 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,810 INFO hive.HiveImport: 2020-12-15 19:53:45,810 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:45,811 INFO hive.HiveImport: 2020-12-15 19:53:45,810 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] DataNucleus.MetaData (Log4JLogger.java:warn(96)) - Metadata has jdbc-type of null yet this is not valid. Ignored
2020-12-15 19:53:47,754 INFO hive.HiveImport: 2020-12-15 19:53:47,754 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(812)) - Added admin role in metastore
2020-12-15 19:53:47,756 INFO hive.HiveImport: 2020-12-15 19:53:47,756 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(821)) - Added public role in metastore
2020-12-15 19:53:47,772 INFO hive.HiveImport: 2020-12-15 19:53:47,772 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(861)) - No user is added in admin role, since config is empty
2020-12-15 19:53:47,881 INFO hive.HiveImport: 2020-12-15 19:53:47,881 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:47,898 INFO hive.HiveImport: 2020-12-15 19:53:47,898 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_all_functions
2020-12-15 19:53:47,899 INFO hive.HiveImport: 2020-12-15 19:53:47,899 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_all_functions
2020-12-15 19:53:47,922 INFO hive.HiveImport: Hive Session ID = 833254ed-dce3-4ae4-8f01-6e34eaba1a4b
2020-12-15 19:53:47,923 INFO hive.HiveImport: 2020-12-15 19:53:47,922 INFO  [pool-10-thread-1] SessionState (SessionState.java:printInfo(1227)) - Hive Session ID = 833254ed-dce3-4ae4-8f01-6e34eaba1a4b
2020-12-15 19:53:47,930 INFO hive.HiveImport: 2020-12-15 19:53:47,930 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:47,940 INFO hive.HiveImport: 2020-12-15 19:53:47,940 INFO  [pool-10-thread-1] session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/hadoopuser/833254ed-dce3-4ae4-8f01-6e34eaba1a4b
2020-12-15 19:53:47,942 INFO hive.HiveImport: 2020-12-15 19:53:47,942 INFO  [pool-10-thread-1] session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/hadoopuser/833254ed-dce3-4ae4-8f01-6e34eaba1a4b
2020-12-15 19:53:47,946 INFO hive.HiveImport: 2020-12-15 19:53:47,946 INFO  [pool-10-thread-1] session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/hadoopuser/833254ed-dce3-4ae4-8f01-6e34eaba1a4b/_tmp_space.db
2020-12-15 19:53:47,946 INFO hive.HiveImport: 2020-12-15 19:53:47,946 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_databases: @hive#
2020-12-15 19:53:47,947 INFO hive.HiveImport: 2020-12-15 19:53:47,947 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_databases: @hive#
2020-12-15 19:53:47,948 INFO hive.HiveImport: 2020-12-15 19:53:47,948 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:47,948 INFO hive.HiveImport: 2020-12-15 19:53:47,948 INFO  [pool-10-thread-1] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:47,960 INFO hive.HiveImport: 2020-12-15 19:53:47,959 INFO  [pool-10-thread-1] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:47,960 INFO hive.HiveImport: 2020-12-15 19:53:47,960 INFO  [pool-10-thread-1] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:47,967 INFO hive.HiveImport: 2020-12-15 19:53:47,967 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,967 INFO hive.HiveImport: 2020-12-15 19:53:47,967 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,971 INFO hive.HiveImport: 2020-12-15 19:53:47,971 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_multi_table : db=default tbls=
2020-12-15 19:53:47,971 INFO hive.HiveImport: 2020-12-15 19:53:47,971 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_multi_table : db=default tbls=
2020-12-15 19:53:47,983 INFO hive.HiveImport: 2020-12-15 19:53:47,973 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_tables_by_type: db=@hive#sqoop pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,983 INFO hive.HiveImport: 2020-12-15 19:53:47,983 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_tables_by_type: db=@hive#sqoop pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,986 INFO hive.HiveImport: 2020-12-15 19:53:47,986 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_multi_table : db=sqoop tbls=
2020-12-15 19:53:47,987 INFO hive.HiveImport: 2020-12-15 19:53:47,986 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_multi_table : db=sqoop tbls=
2020-12-15 19:53:47,987 INFO hive.HiveImport: 2020-12-15 19:53:47,987 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,987 INFO hive.HiveImport: 2020-12-15 19:53:47,987 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2020-12-15 19:53:47,990 INFO hive.HiveImport: 2020-12-15 19:53:47,989 INFO  [pool-10-thread-1] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: get_multi_table : db=test tbls=
2020-12-15 19:53:47,990 INFO hive.HiveImport: 2020-12-15 19:53:47,990 INFO  [pool-10-thread-1] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser     ip=unknown-ip-addr    cmd=get_multi_table : db=test tbls=
2020-12-15 19:53:47,990 INFO hive.HiveImport: 2020-12-15 19:53:47,990 INFO  [pool-10-thread-1] metadata.HiveMaterializedViewsRegistry (HiveMaterializedViewsRegistry.java:run(171)) - Materialized views registry has been initialized
2020-12-15 19:53:47,998 INFO hive.HiveImport: 2020-12-15 19:53:47,998 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(554)) - Compiling command(queryId=hadoopuser_20201215195347_108c2a9a-c30a-4bb3-96f7-c04465046115): CREATE TABLE `test.departments` ( `dept_no` STRING, `dept_name` STRING) COMMENT 'Imported by sqoop on 2020/12/15 19:53:38' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
2020-12-15 19:53:48,624 INFO hive.HiveImport: 2020-12-15 19:53:48,624 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:48,627 INFO hive.HiveImport: 2020-12-15 19:53:48,627 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] parse.CalcitePlanner (SemanticAnalyzer.java:analyzeInternal(12123)) - Starting Semantic Analysis
2020-12-15 19:53:48,642 INFO hive.HiveImport: 2020-12-15 19:53:48,642 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] sqlstd.SQLStdHiveAccessController (SQLStdHiveAccessController.java:<init>(96)) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=63eed2d2-8c51-47fc-acb7-6f22c42aa3e4, clientType=HIVECLI]
2020-12-15 19:53:48,644 INFO hive.HiveImport: 2020-12-15 19:53:48,644 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] session.SessionState (SessionState.java:setAuthorizerV2Config(950)) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2020-12-15 19:53:48,644 INFO hive.HiveImport: 2020-12-15 19:53:48,644 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStoreClient (HiveMetaStoreClient.java:isCompatibleWith(346)) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2020-12-15 19:53:48,646 INFO hive.HiveImport: 2020-12-15 19:53:48,645 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:48,646 INFO hive.HiveImport: 2020-12-15 19:53:48,646 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:48,646 INFO hive.HiveImport: 2020-12-15 19:53:48,646 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:48,646 INFO hive.HiveImport: 2020-12-15 19:53:48,646 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:48,649 INFO hive.HiveImport: 2020-12-15 19:53:48,648 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:48,649 INFO hive.HiveImport: 2020-12-15 19:53:48,649 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:48,649 INFO hive.HiveImport: 2020-12-15 19:53:48,649 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:48,653 INFO hive.HiveImport: 2020-12-15 19:53:48,653 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:48,653 INFO hive.HiveImport: 2020-12-15 19:53:48,653 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:48,653 INFO hive.HiveImport: 2020-12-15 19:53:48,653 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:48,661 INFO hive.HiveImport: 2020-12-15 19:53:48,661 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] parse.CalcitePlanner (SemanticAnalyzer.java:analyzeCreateTable(12993)) - Creating table test.departments position=13
2020-12-15 19:53:48,678 INFO hive.HiveImport: 2020-12-15 19:53:48,677 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:48,678 INFO hive.HiveImport: 2020-12-15 19:53:48,678 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:48,682 INFO hive.HiveImport: 2020-12-15 19:53:48,681 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:48,682 INFO hive.HiveImport: 2020-12-15 19:53:48,682 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:48,682 INFO hive.HiveImport: 2020-12-15 19:53:48,682 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:48,683 INFO hive.HiveImport: 2020-12-15 19:53:48,682 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_database: @hive#test
2020-12-15 19:53:48,683 INFO hive.HiveImport: 2020-12-15 19:53:48,683 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_database: @hive#test
2020-12-15 19:53:48,705 INFO hive.HiveImport: 2020-12-15 19:53:48,705 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(666)) - Semantic Analysis Completed (retrial = false)
2020-12-15 19:53:48,711 INFO hive.HiveImport: 2020-12-15 19:53:48,711 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:getSchema(374)) - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2020-12-15 19:53:48,715 INFO hive.HiveImport: 2020-12-15 19:53:48,715 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(781)) - Completed compiling command(queryId=hadoopuser_20201215195347_108c2a9a-c30a-4bb3-96f7-c04465046115); Time taken: 0.744 seconds
2020-12-15 19:53:48,715 INFO hive.HiveImport: 2020-12-15 19:53:48,715 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] reexec.ReExecDriver (ReExecDriver.java:run(156)) - Execution #1 of query
2020-12-15 19:53:48,716 INFO hive.HiveImport: 2020-12-15 19:53:48,716 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:48,716 INFO hive.HiveImport: 2020-12-15 19:53:48,716 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:execute(2255)) - Executing command(queryId=hadoopuser_20201215195347_108c2a9a-c30a-4bb3-96f7-c04465046115): CREATE TABLE `test.departments` ( `dept_no` STRING, `dept_name` STRING) COMMENT 'Imported by sqoop on 2020/12/15 19:53:38' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
2020-12-15 19:53:48,728 INFO hive.HiveImport: 2020-12-15 19:53:48,728 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:launchTask(2662)) - Starting task [Stage-0:DDL] in serial mode
2020-12-15 19:53:48,728 INFO hive.HiveImport: 2020-12-15 19:53:48,728 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStoreClient (HiveMetaStoreClient.java:isCompatibleWith(346)) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2020-12-15 19:53:48,729 INFO hive.HiveImport: 2020-12-15 19:53:48,729 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:48,729 INFO hive.HiveImport: 2020-12-15 19:53:48,729 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:48,729 INFO hive.HiveImport: 2020-12-15 19:53:48,729 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:48,729 INFO hive.HiveImport: 2020-12-15 19:53:48,729 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:49,000 INFO hive.HiveImport: 2020-12-15 19:53:48,994 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:49,001 INFO hive.HiveImport: 2020-12-15 19:53:48,995 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:49,001 INFO hive.HiveImport: 2020-12-15 19:53:48,995 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:49,001 INFO hive.HiveImport: 2020-12-15 19:53:48,998 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:49,001 INFO hive.HiveImport: 2020-12-15 19:53:48,998 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:49,003 INFO hive.HiveImport: 2020-12-15 19:53:48,999 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:49,003 INFO hive.HiveImport: 2020-12-15 19:53:49,000 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: create_table: Table(tableName:departments, dbName:test, owner:hadoopuser, createTime:1608029628, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:dept_no, type:string, comment:null), FieldSchema(name:dept_name, type:string, comment:null)], location:null, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{line.delim=
2020-12-15 19:53:49,003 INFO hive.HiveImport: , field.delim=, serialization.format=}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{totalSize=0, numRows=0, rawDataSize=0, COLUMN_STATS_ACCURATE={"BASIC_STATS":"true","COLUMN_STATS":{"dept_name":"true","dept_no":"true"}}, numFiles=0, bucketing_version=2, comment=Imported by sqoop on 2020/12/15 19:53:38}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{hadoopuser=[PrivilegeGrantInfo(privilege:INSERT, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:SELECT, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:UPDATE, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:DELETE, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true)]}, groupPrivileges:null, rolePrivileges:null), temporary:false, catName:hive, ownerType:USER)
2020-12-15 19:53:49,003 INFO hive.HiveImport: 2020-12-15 19:53:49,000 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=create_table: Table(tableName:departments, dbName:test, owner:hadoopuser, createTime:1608029628, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:dept_no, type:string, comment:null), FieldSchema(name:dept_name, type:string, comment:null)], location:null, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{line.delim=
2020-12-15 19:53:49,004 INFO hive.HiveImport: , field.delim=, serialization.format=}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{totalSize=0, numRows=0, rawDataSize=0, COLUMN_STATS_ACCURATE={"BASIC_STATS":"true","COLUMN_STATS":{"dept_name":"true","dept_no":"true"}}, numFiles=0, bucketing_version=2, comment=Imported by sqoop on 2020/12/15 19:53:38}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{hadoopuser=[PrivilegeGrantInfo(privilege:INSERT, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:SELECT, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:UPDATE, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true), PrivilegeGrantInfo(privilege:DELETE, createTime:-1, grantor:hadoopuser, grantorType:USER, grantOption:true)]}, groupPrivileges:null, rolePrivileges:null), temporary:false, catName:hive, ownerType:USER)
2020-12-15 19:53:49,030 INFO hive.HiveImport: 2020-12-15 19:53:49,030 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] utils.FileUtils (FileUtils.java:mkdir(167)) - Creating directory if it doesn't exist: hdfs://hadoop1:9000/user/hive/warehouse/test.db/departments
2020-12-15 19:53:49,148 INFO hive.HiveImport: 2020-12-15 19:53:49,148 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:execute(2531)) - Completed executing command(queryId=hadoopuser_20201215195347_108c2a9a-c30a-4bb3-96f7-c04465046115); Time taken: 0.431 seconds
2020-12-15 19:53:49,148 INFO hive.HiveImport: 2020-12-15 19:53:49,148 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (SessionState.java:printInfo(1227)) - OK
2020-12-15 19:53:49,148 INFO hive.HiveImport: 2020-12-15 19:53:49,148 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:49,148 INFO hive.HiveImport: OK
2020-12-15 19:53:49,149 INFO hive.HiveImport: Time taken: 1.178 seconds
2020-12-15 19:53:49,149 INFO hive.HiveImport: 2020-12-15 19:53:49,149 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] CliDriver (SessionState.java:printInfo(1227)) - Time taken: 1.178 seconds
2020-12-15 19:53:49,149 INFO hive.HiveImport: 2020-12-15 19:53:49,149 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:49,149 INFO hive.HiveImport: 2020-12-15 19:53:49,149 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2020-12-15 19:53:49,149 INFO hive.HiveImport: 2020-12-15 19:53:49,149 INFO  [main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:49,150 INFO hive.HiveImport: 2020-12-15 19:53:49,149 INFO  [main] session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main
2020-12-15 19:53:49,151 INFO hive.HiveImport: 2020-12-15 19:53:49,151 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(554)) - Compiling command(queryId=hadoopuser_20201215195349_3c1eb187-a7af-4614-80d4-435c176f1cdb):
2020-12-15 19:53:49,152 INFO hive.HiveImport: LOAD DATA INPATH 'hdfs://hadoop1:9000/user/hadoopuser/departments' INTO TABLE `test.departments`
2020-12-15 19:53:49,166 INFO hive.HiveImport: 2020-12-15 19:53:49,165 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStoreClient (HiveMetaStoreClient.java:isCompatibleWith(346)) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2020-12-15 19:53:49,166 INFO hive.HiveImport: 2020-12-15 19:53:49,166 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:49,167 INFO hive.HiveImport: 2020-12-15 19:53:49,167 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:49,167 INFO hive.HiveImport: 2020-12-15 19:53:49,167 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:49,167 INFO hive.HiveImport: 2020-12-15 19:53:49,167 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:49,168 INFO hive.HiveImport: 2020-12-15 19:53:49,168 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:49,174 INFO hive.HiveImport: 2020-12-15 19:53:49,173 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:49,174 INFO hive.HiveImport: 2020-12-15 19:53:49,174 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:49,174 INFO hive.HiveImport: 2020-12-15 19:53:49,174 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:49,177 INFO hive.HiveImport: 2020-12-15 19:53:49,177 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:49,177 INFO hive.HiveImport: 2020-12-15 19:53:49,177 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:49,178 INFO hive.HiveImport: 2020-12-15 19:53:49,178 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:49,180 INFO hive.HiveImport: 2020-12-15 19:53:49,179 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_table : tbl=hive.test.departments
2020-12-15 19:53:49,180 INFO hive.HiveImport: 2020-12-15 19:53:49,180 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_table : tbl=hive.test.departments
2020-12-15 19:53:49,310 INFO hive.HiveImport: 2020-12-15 19:53:49,310 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] sasl.SaslDataTransferClient (SaslDataTransferClient.java:checkTrustAndSend(239)) - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-12-15 19:53:49,413 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(666)) - Semantic Analysis Completed (retrial = false)
2020-12-15 19:53:49,413 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:getSchema(374)) - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2020-12-15 19:53:49,413 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:compile(781)) - Completed compiling command(queryId=hadoopuser_20201215195349_3c1eb187-a7af-4614-80d4-435c176f1cdb); Time taken: 0.262 seconds
2020-12-15 19:53:49,413 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] reexec.ReExecDriver (ReExecDriver.java:run(156)) - Execution #1 of query
2020-12-15 19:53:49,413 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:49,414 INFO hive.HiveImport: 2020-12-15 19:53:49,413 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:execute(2255)) - Executing command(queryId=hadoopuser_20201215195349_3c1eb187-a7af-4614-80d4-435c176f1cdb):
2020-12-15 19:53:49,414 INFO hive.HiveImport: LOAD DATA INPATH 'hdfs://hadoop1:9000/user/hadoopuser/departments' INTO TABLE `test.departments`
2020-12-15 19:53:49,414 INFO hive.HiveImport: 2020-12-15 19:53:49,414 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:launchTask(2662)) - Starting task [Stage-0:MOVE] in serial mode
2020-12-15 19:53:49,415 INFO hive.HiveImport: 2020-12-15 19:53:49,414 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:49,415 INFO hive.HiveImport: 2020-12-15 19:53:49,415 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:49,415 INFO hive.HiveImport: Loading data to table test.departments
2020-12-15 19:53:49,415 INFO hive.HiveImport: 2020-12-15 19:53:49,415 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:49,416 INFO hive.HiveImport: 2020-12-15 19:53:49,415 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:49,416 INFO hive.HiveImport: 2020-12-15 19:53:49,415 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] exec.Task (SessionState.java:printInfo(1227)) - Loading data to table test.departments from hdfs://hadoop1:9000/user/hadoopuser/departments
2020-12-15 19:53:49,418 INFO hive.HiveImport: 2020-12-15 19:53:49,418 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:49,418 INFO hive.HiveImport: 2020-12-15 19:53:49,418 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:49,418 INFO hive.HiveImport: 2020-12-15 19:53:49,418 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:49,423 INFO hive.HiveImport: 2020-12-15 19:53:49,423 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:49,423 INFO hive.HiveImport: 2020-12-15 19:53:49,423 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:49,424 INFO hive.HiveImport: 2020-12-15 19:53:49,423 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:49,424 INFO hive.HiveImport: 2020-12-15 19:53:49,424 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_table : tbl=hive.test.departments
2020-12-15 19:53:49,425 INFO hive.HiveImport: 2020-12-15 19:53:49,424 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_table : tbl=hive.test.departments
2020-12-15 19:53:49,452 INFO hive.HiveImport: 2020-12-15 19:53:49,452 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_table : tbl=hive.test.departments
2020-12-15 19:53:49,462 INFO hive.HiveImport: 2020-12-15 19:53:49,452 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_table : tbl=hive.test.departments
2020-12-15 19:53:49,509 INFO hive.HiveImport: 2020-12-15 19:53:49,508 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: alter_table: hive.test.departments newtbl=departments
2020-12-15 19:53:49,509 INFO hive.HiveImport: 2020-12-15 19:53:49,509 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=alter_table: hive.test.departments newtbl=departments
2020-12-15 19:53:49,566 INFO hive.HiveImport: 2020-12-15 19:53:49,566 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:launchTask(2662)) - Starting task [Stage-1:STATS] in serial mode
2020-12-15 19:53:49,566 INFO hive.HiveImport: 2020-12-15 19:53:49,566 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:49,567 INFO hive.HiveImport: 2020-12-15 19:53:49,567 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:49,567 INFO hive.HiveImport: 2020-12-15 19:53:49,567 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:49,568 INFO hive.HiveImport: 2020-12-15 19:53:49,567 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:49,568 INFO hive.HiveImport: 2020-12-15 19:53:49,568 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] stats.BasicStatsTask (BasicStatsTask.java:process(96)) - Executing stats task
2020-12-15 19:53:49,570 INFO hive.HiveImport: 2020-12-15 19:53:49,570 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2020-12-15 19:53:49,570 INFO hive.HiveImport: 2020-12-15 19:53:49,570 WARN  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2020-12-15 19:53:49,571 INFO hive.HiveImport: 2020-12-15 19:53:49,570 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2020-12-15 19:53:49,574 INFO hive.HiveImport: 2020-12-15 19:53:49,574 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is MYSQL
2020-12-15 19:53:49,574 INFO hive.HiveImport: 2020-12-15 19:53:49,574 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2020-12-15 19:53:49,575 INFO hive.HiveImport: 2020-12-15 19:53:49,575 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:<init>(97)) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hadoopuser (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-12-15 19:53:49,575 INFO hive.HiveImport: 2020-12-15 19:53:49,575 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: get_table : tbl=hive.test.departments
2020-12-15 19:53:49,576 INFO hive.HiveImport: 2020-12-15 19:53:49,575 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=get_table : tbl=hive.test.departments
2020-12-15 19:53:49,599 INFO hive.HiveImport: 2020-12-15 19:53:49,598 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: alter_table: hive.test.departments newtbl=departments
2020-12-15 19:53:49,599 INFO hive.HiveImport: 2020-12-15 19:53:49,599 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser  ip=unknown-ip-addr      cmd=alter_table: hive.test.departments newtbl=departments
2020-12-15 19:53:49,631 INFO hive.HiveImport: 2020-12-15 19:53:49,630 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] stats.BasicStatsTask (BasicStatsTask.java:aggregateStats(280)) - Table test.departments stats: [numFiles=1, numRows=0, totalSize=153, rawDataSize=0]
2020-12-15 19:53:49,631 INFO hive.HiveImport: 2020-12-15 19:53:49,631 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:execute(2531)) - Completed executing command(queryId=hadoopuser_20201215195349_3c1eb187-a7af-4614-80d4-435c176f1cdb); Time taken: 0.218 seconds
2020-12-15 19:53:49,632 INFO hive.HiveImport: OK
2020-12-15 19:53:49,632 INFO hive.HiveImport: 2020-12-15 19:53:49,632 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (SessionState.java:printInfo(1227)) - OK
2020-12-15 19:53:49,632 INFO hive.HiveImport: 2020-12-15 19:53:49,632 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] ql.Driver (Driver.java:checkConcurrency(285)) - Concurrency mode is disabled, not creating a lock manager
2020-12-15 19:53:49,632 INFO hive.HiveImport: Time taken: 0.481 seconds
2020-12-15 19:53:49,632 INFO hive.HiveImport: 2020-12-15 19:53:49,632 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] CliDriver (SessionState.java:printInfo(1227)) - Time taken: 0.481 seconds
2020-12-15 19:53:49,633 INFO hive.HiveImport: 2020-12-15 19:53:49,633 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:49,633 INFO hive.HiveImport: 2020-12-15 19:53:49,633 INFO  [63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 main] session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2020-12-15 19:53:49,633 INFO hive.HiveImport: 2020-12-15 19:53:49,633 INFO  [main] conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 63eed2d2-8c51-47fc-acb7-6f22c42aa3e4
2020-12-15 19:53:49,643 INFO hive.HiveImport: 2020-12-15 19:53:49,643 INFO  [main] session.SessionState (SessionState.java:dropPathAndUnregisterDeleteOnExit(885)) - Deleted directory: /tmp/hive/hadoopuser/63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 on fs with scheme hdfs
2020-12-15 19:53:49,648 INFO hive.HiveImport: 2020-12-15 19:53:49,648 INFO  [main] session.SessionState (SessionState.java:dropPathAndUnregisterDeleteOnExit(885)) - Deleted directory: /tmp/hadoopuser/63eed2d2-8c51-47fc-acb7-6f22c42aa3e4 on fs with scheme file
2020-12-15 19:53:49,653 INFO hive.HiveImport: 2020-12-15 19:53:49,653 INFO  [main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Cleaning up thread local RawStore...
2020-12-15 19:53:49,654 INFO hive.HiveImport: 2020-12-15 19:53:49,654 INFO  [main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser ip=unknown-ip-addr      cmd=Cleaning up thread local RawStore...
2020-12-15 19:53:49,654 INFO hive.HiveImport: 2020-12-15 19:53:49,654 INFO  [main] metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 0: Done cleaning up thread local RawStore
2020-12-15 19:53:49,654 INFO hive.HiveImport: 2020-12-15 19:53:49,654 INFO  [main] HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=hadoopuser ip=unknown-ip-addr      cmd=Done cleaning up thread local RawStore
2020-12-15 19:53:50,005 INFO hive.HiveImport: Hive import complete.
2020-12-15 19:53:50,011 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.


That was an avalanche of log output, but the key steps are all in there: Hive compiled and ran a CREATE TABLE for test.departments, then a LOAD DATA INPATH 'hdfs://hadoop1:9000/user/hadoopuser/departments' INTO TABLE `test.departments`, and the run ends with "Hive import complete." Since the import says it succeeded, let's verify it, first at the HDFS level and then from the Hive CLI.
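The warehouse location from the log above (hdfs://hadoop1:9000/user/hive/warehouse/test.db/departments) should now hold the loaded data file. A quick sanity check from the shell, assuming Sqoop's usual part-m-* file naming (the exact file name may differ, hence the wildcard):

$ # list the file(s) that LOAD DATA moved into the table's warehouse directory
$ hadoop fs -ls /user/hive/warehouse/test.db/departments
$ # print the raw contents of whatever landed there
$ hadoop fs -cat /user/hive/warehouse/test.db/departments/*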

$ hive
hive> show databases;
default
test

hive> show tables;
departments
tbl

hive> select * from departments;
d009,Customer Service   NULL
d005,Development        NULL
d002,Finance    NULL
d003,Human Resources    NULL
d001,Marketing  NULL
d004,Production NULL
d006,Quality Management NULL
d008,Research   NULL
d007,Sales      NULL

The department data did make it in. One caveat, though: each row above shows the entire comma-separated record in the dept_no column with NULL for dept_name, which hints that the comma-delimited data file doesn't match the field delimiter stored in the table's SerDe (with --hive-import, Sqoop defaults to Hive's Ctrl-A delimiter unless told otherwise).
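If you see the same NULL column, a minimal fix, assuming the file really is comma-delimited as the output above suggests, is to point the existing table's SerDe at the comma instead of re-importing:

hive> ALTER TABLE test.departments SET SERDEPROPERTIES ('field.delim' = ',');
hive> select * from departments;

Alternatively, re-running the Sqoop import with an explicit --fields-terminated-by that matches the table definition keeps the data file and the metastore in agreement.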


Phew, that turned out to be so simple! Now, time to eat. The end.