
Alex's Hadoop Beginner Tutorial, Lesson 8: Sqoop1 Import into HBase and Hive


Continuing the series. Honestly, importing and exporting between MySQL and HDFS is of little use in real projects, but it works well as an introduction. Today we cover how Sqoop cooperates with HBase and Hive. I've also realized the ordering of this tutorial series is a mess: I never covered installing Hive first. My apologies; I'll make that up in a later post.

Data Preparation

MySQL

Create an employee table in MySQL and insert some data:
CREATE TABLE `employee` ( 
 `id` int(11) NOT NULL, 
 `name` varchar(20) NOT NULL, 
 PRIMARY KEY (`id`) 
) ENGINE=MyISAM DEFAULT CHARSET=utf8; 

insert into employee (id,name) values (1,'michael'); 
insert into employee (id,name) values (2,'ted'); 
insert into employee (id,name) values (3,'jack'); 
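If you want a quick sanity check of the source rows before running Sqoop, a simple query will do (optional):
-- should return (1,'michael'), (2,'ted'), (3,'jack')
SELECT id, name FROM employee ORDER BY id;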

HBase

Create the target table with an info column family in the HBase shell:

hbase(main):006:0> create 'employee','info'
0 row(s) in 0.4440 seconds

=> Hbase::Table - employee

Hive

No data preparation is needed here; later we'll pass --create-hive-table and Sqoop will create the table automatically.
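For reference, the table Sqoop creates should be roughly equivalent to the HiveQL below. This is a sketch based on Sqoop's usual MySQL-to-Hive type mapping (int becomes INT, varchar becomes STRING) and its default Hive delimiters (Ctrl-A between fields, newline between rows); the exact DDL Sqoop emits may differ:
CREATE TABLE hive_employee (
  id INT,
  name STRING
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE;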

從mysql導(dǎo)入到Hbase

# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hbase-table employee --column-family info --hbase-row-key id -m 1
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 17:36:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 17:36:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 17:36:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 17:36:25 INFO tool.CodeGenTool: Beginning code generation
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
... (intermediate log output omitted)
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 37.3924 seconds (0 bytes/sec)
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Retrieved 3 records.
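For readability, here is the same import command split across lines, with the HBase-specific flags annotated; it is functionally identical to the one-liner above:
# --hbase-table:   target HBase table (created earlier in the HBase shell)
# --column-family: column family that receives the MySQL columns
# --hbase-row-key: MySQL column whose value becomes the HBase row key
# -m 1:            run a single map task, plenty for 3 rows
sqoop import \
  --connect jdbc:mysql://localhost:3306/sqoop_test \
  --username root --password root \
  --table employee \
  --hbase-table employee \
  --column-family info \
  --hbase-row-key id \
  -m 1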


Check HBase:
hbase(main):001:0> scan 'employee'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ROW COLUMN+CELL 
 1 column=info:name, timestamp=1417426628685, value=michael 
 2 column=info:name, timestamp=1417426628685, value=ted 
 3 column=info:name, timestamp=1417426628685, value=jack 
3 row(s) in 0.1630 seconds

All 3 records were inserted successfully.

從mysql導(dǎo)入hive

# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hive-import --hive-table hive_employee --create-hive-table
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
... (log output omitted)
14/12/02 15:12:13 INFO hive.HiveImport: Loading data to table default.hive_employee
14/12/02 15:12:14 INFO hive.HiveImport: Table default.hive_employee stats: [num_partitions: 0, num_files: 4, num_rows: 0, total_size: 23, raw_data_size: 0]
14/12/02 15:12:14 INFO hive.HiveImport: OK
14/12/02 15:12:14 INFO hive.HiveImport: Time taken: 0.799 seconds
14/12/02 15:12:14 INFO hive.HiveImport: Hive import complete.
14/12/02 15:12:14 INFO hive.HiveImport: Export directory is empty, removing it.
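Annotated in the same way; only the Hive-related flags differ from the HBase run:
# --hive-import:       load the data into the Hive warehouse after the HDFS import
# --hive-table:        name of the target Hive table
# --create-hive-table: create the table from the MySQL schema; the job fails
#                      if a table with that name already exists
sqoop import \
  --connect jdbc:mysql://localhost:3306/sqoop_test \
  --username root --password root \
  --table employee \
  --hive-import \
  --hive-table hive_employee \
  --create-hive-table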

A note about real environments: don't use localhost in the MySQL JDBC connection string. The import job is shipped out to the various Hadoop worker machines, and those machines must genuinely be able to reach MySQL over JDBC; otherwise you will lose data.
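In other words, the connect string should use a hostname that every worker node can resolve and reach; for example (mysql-host.example.com is a placeholder):
--connect jdbc:mysql://mysql-host.example.com:3306/sqoop_test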
Check Hive:
hive> select * from hive_employee;
OK
1	michael
2	ted
3	jack
Time taken: 0.179 seconds, Fetched: 3 row(s)

One more caveat: at the moment Sqoop can only import MySQL data into native Hive tables (that is, tables stored on HDFS); it cannot import into external tables (for example, Hive tables built on top of HBase).
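To make the distinction concrete, the following is the kind of table Sqoop cannot load into: a Hive external table mapped onto our HBase employee table through the HBase storage handler (the table name and column mapping here are only illustrative):
CREATE EXTERNAL TABLE hbase_employee (id INT, name STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,info:name')
TBLPROPERTIES ('hbase.table.name' = 'employee');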
Class dismissed! Next time: exporting!

