Hbase数据导出实例_hbase shell 导出查询-程序员宅基地

技术标签: 大数据-hadoop  hbase  

 

      1. Hbase数据导出实例

需求:

根据时间范围、区域等条件查询,将hbase中终端采集数据最大时间、最小时间的日志数据导出

参考文档:

http://blog.csdn.net/qq_27078095/article/details/56482010

https://www.cnblogs.com/szw-blog/p/5512349.html

http://blog.csdn.net/javajxz008/article/details/61173213

http://www.cnblogs.com/husky/p/6422001.html

https://my.oschina.net/wangjiankui/blog/497658

https://www.2cto.com/net/201708/673854.html

http://www.cnblogs.com/husky/p/6764317.html

 

解决办法:

  1. 通过hbase自带导入导出将查询到的终端mac数据导出到指定目录

hbase org.apache.hadoop.hbase.mapreduce.Driver export 表名称    目录

这样导出是整个表数据,没法过滤,姑且暂时不能使用。

  1. 通过查询条件过滤导出

scan 'LOG20180108',{COLUMNS => 'INFO',LIMIT=>1,FILTER=>"(PrefixFilter('T')) AND (SingleColumnValueFilter('INFO','AreaCode',=,'binary:610103'))"}

导出到文件:

echo "scan 'LOG20180108',{COLUMNS => ['INFO'],LIMIT=>1,FILTER=>\"(PrefixFilter('T')) AND (SingleColumnValueFilter('INFO','AreaCode',=,'binary:610103'))\"}" | hbase shell > myText.csv

 

  1. 通过hive导出
  1. 将hbase表与hive临时表同步。
  2. 将hive临时表数据导入到真实表
  3. 将真实表数据导入数据库

脚本如下:

#!/bin/bash

 

#获取输入时间,默认为当前时间

get_time(){

    if [ 3 -eq $# ]

    then

        date=`date -d "+0 day $1" +%Y%m%d`

        enddate=`date -d "+1 day $2" +%Y%m%d`      

    elif [ 0 -eq $# ]

    then

        echo "无输入参数,则默认构建昨天."

        echo "若是批量构建,请输入时间段,格式为【$0 yyyy-mm-dd yyyy-mm-dd】."

        #read -p "若不输入参数则默认构建昨天数据,输入【y】继续构建昨日数据,输入【n】退出:" isBuild

        #case $isBuild in

        #y | Y)

            date=`date -d "+0 day yesterday" +%Y%m%d`

            enddate=`date +%Y%m%d`

        #   ;;

        #n | N)

        #   exit

        #   ;;

        #*)

        #   echo "输入错误,退出"

        #   exit

        #   ;;

        #esac

    else

        echo "输入有误."

        echo "若是批量构建,请输入时间段,格式为【$0 yyyy-mm-dd yyyy-mm-dd】."

        echo "若默认构建昨天数据,则不需要输入参数,直接执行【$0】."

    fi

}

 

 

 

#创建存储数据表结构

hive_table(){

    echo "create hive table start......."

    hive -S -e "DROP TABLE IF EXISTS LogTerminal;

    CREATE TABLE LogTerminal(rowkey string,TerminalID string,TerminalMac string,DeviceType string, Power string,Channel string,MaxPower string,TimeNear string,LonNear string,LatNear string,RouteMac string,SSID string, SSIDs string,SecurityType string,RealType string, RealCode string,TerminalType int,PcBrand string, Phone string,IMSI string,IMEI string,OS string,CustomerStartTime string,OffLineTime string, Model string,CoordinateX string,CoordinateY string, OffLineLon string,OffLineLat string,PcIP string, RouteType string,SessionID string,ProtoType string,CyberCode string,IsMove string,IsElectric string, SafeState string,GPIOState string,Serial string, GuildID string,Time string,ManufacturerCode string,AreaCode string,UnitCode string,MachineCode string, SystemType string,DATASOURCEID string,Lon string, Lat string,LatLon string,RESOURCETYPE string, AccessSystemID string,InterfaceID string,InterfaceGroupID string,WriterTime string )COMMENT 'LogTerminal Table' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS PARQUET; "

    hive -S -e "DROP TABLE IF EXISTS LogTerminal_min;

    CREATE TABLE LogTerminal_min(rowkey string,TerminalID string,TerminalMac string,DeviceType string, Power string,Channel string,MaxPower string,TimeNear string,LonNear string,LatNear string,RouteMac string,SSID string, SSIDs string,SecurityType string,RealType string, RealCode string,TerminalType int,PcBrand string, Phone string,IMSI string,IMEI string,OS string,CustomerStartTime string,OffLineTime string, Model string,CoordinateX string,CoordinateY string, OffLineLon string,OffLineLat string,PcIP string, RouteType string,SessionID string,ProtoType string,CyberCode string,IsMove string,IsElectric string, SafeState string,GPIOState string,Serial string, GuildID string,Time string,ManufacturerCode string,AreaCode string,UnitCode string,MachineCode string, SystemType string,DATASOURCEID string,Lon string, Lat string,LatLon string,RESOURCETYPE string, AccessSystemID string,InterfaceID string,InterfaceGroupID string,WriterTime string )COMMENT 'LogTerminal Table' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS PARQUET; "

    hive -S -e "DROP TABLE IF EXISTS LogTerminal_max;

    CREATE TABLE LogTerminal_max(rowkey string,TerminalID string,TerminalMac string,DeviceType string, Power string,Channel string,MaxPower string,TimeNear string,LonNear string,LatNear string,RouteMac string,SSID string, SSIDs string,SecurityType string,RealType string, RealCode string,TerminalType int,PcBrand string, Phone string,IMSI string,IMEI string,OS string,CustomerStartTime string,OffLineTime string, Model string,CoordinateX string,CoordinateY string, OffLineLon string,OffLineLat string,PcIP string, RouteType string,SessionID string,ProtoType string,CyberCode string,IsMove string,IsElectric string, SafeState string,GPIOState string,Serial string, GuildID string,Time string,ManufacturerCode string,AreaCode string,UnitCode string,MachineCode string, SystemType string,DATASOURCEID string,Lon string, Lat string,LatLon string,RESOURCETYPE string, AccessSystemID string,InterfaceID string,InterfaceGroupID string,WriterTime string )COMMENT 'LogTerminal Table' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS PARQUET; "

    echo "create hive table end......."

}

 

#创建临时终端日志表结构,并将hbase表和hive临时表关联,然后根据条件查询临时表数据插入到真实表中

hive_task(){

    echo "hive task $1 $2 $3  ..."

    DATA_FORMAT=`date -d "$1" +%Y-%m-%d`

    TABLE_NAME=LOG$1

    AREA_CODE=$3

   

    echo $DATA_FORMAT

    echo $TABLE_NAME

    echo $AREA_CODE

   

    hive -S -e "DROP TABLE  IF EXISTS TempLogTerminal;

    CREATE EXTERNAL TABLE TempLogTerminal(key string,TerminalID string,TerminalMac string,DeviceType string, Power string,Channel string,MaxPower string,TimeNear string,LonNear string,LatNear string,RouteMac string,SSID string, SSIDs string,SecurityType string,RealType string, RealCode string,TerminalType int,PcBrand string, Phone string,IMSI string,IMEI string,OS string,CustomerStartTime string,OffLineTime string, Model string,CoordinateX string,CoordinateY string, OffLineLon string,OffLineLat string,PcIP string, RouteType string,SessionID string,ProtoType string,CyberCode string,IsMove string,IsElectric string, SafeState string,GPIOState string,Serial string, GuildID string,Time string,ManufacturerCode string,AreaCode string,UnitCode string,MachineCode string,SystemType string,DATASOURCEID string,Lon string, Lat string,LatLon string,RESOURCETYPE string, AccessSystemID string,InterfaceID string,InterfaceGroupID string,WriterTime string)  ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES('hbase.columns.mapping'=':key,INFO:TerminalID,INFO:TerminalMac,INFO:DeviceType,INFO:Power,INFO:Channel,INFO:MaxPower,INFO:TimeNear,INFO:LonNear,INFO:LatNear,INFO:RouteMac,INFO:SSID,INFO:SSIDs,INFO:SecurityType,INFO:RealType,INFO:RealCode,INFO:TerminalType,INFO:PcBrand,INFO:Phone,INFO:IMSI,INFO:IMEI,INFO:OS,INFO:CustomerStartTime,INFO:OffLineTime,INFO:Model,INFO:CoordinateX,INFO:CoordinateY,INFO:OffLineLon,INFO:OffLineLat,INFO:PcIP,INFO:RouteType,INFO:SessionID,INFO:ProtoType,INFO:CyberCode,INFO:IsMove,INFO:IsElectric,INFO:SafeState,INFO:GPIOState,INFO:Serial,INFO:GuildID,INFO:Time,INFO:ManufacturerCode,INFO:AreaCode,INFO:UnitCode,INFO:MachineCode,INFO:SystemType,INFO:DATASOURCEID,INFO:Lon,INFO:Lat,INFO:LatLon,INFO:RESOURCETYPE,INFO:AccessSystemID,INFO:InterfaceID,INFO:InterfaceGroupID,INFO:WriterTime') TBLPROPERTIES('hbase.table.name'='$TABLE_NAME') ;

    INSERT $2 TABLE logterminal SELECT key as rowkey,TerminalID,TerminalMac,DeviceType,Power,Channel,MaxPower,TimeNear,LonNear,LatNear,RouteMac,SSID,SSIDs,SecurityType,RealType,RealCode,TerminalType,PcBrand, Phone,IMSI,IMEI,OS,CustomerStartTime,OffLineTime,Model,CoordinateX,CoordinateY,OffLineLon,OffLineLat,PcIP,RouteType,SessionID,ProtoType,CyberCode,IsMove,IsElectric, SafeState,GPIOState,Serial,GuildID,Time,ManufacturerCode,AreaCode,UnitCode,MachineCode,SystemType,DATASOURCEID,Lon,Lat,LatLon,RESOURCETYPE,AccessSystemID,InterfaceID,InterfaceGroupID,WriterTime

    FROM TempLogTerminal WHERE RESOURCETYPE=32 AND  AreaCode='$AREA_CODE';"

   

    echo "hive task end......."

}

 

#创建sql表

sqlserver_table(){

"

if exists (select * from sysobjects where name='LogTerminal_min')

drop table LogTerminal_min

 

CREATE  TABLE LogTerminal_min(

    rowkey nvarchar(200),TerminalID bigint,TerminalMac nvarchar(200),DeviceType bigint,

    Power bigint,Channel bigint,MaxPower bigint,

    TimeNear nvarchar(200),LonNear float,LatNear float,

    RouteMac nvarchar(200),SSID nvarchar(200),

    SSIDs nvarchar(200),SecurityType nvarchar(200),RealType bigint,

    RealCode nvarchar(200),TerminalType bigint,PcBrand nvarchar(200),

    Phone nvarchar(200),IMSI nvarchar(200),IMEI nvarchar(200),

    OS nvarchar(200),CustomerStartTime nvarchar(200),OffLineTime nvarchar(200),

    Model nvarchar(200),CoordinateX nvarchar(200),CoordinateY nvarchar(200),

    OffLineLon float,OffLineLat float,PcIP nvarchar(200),

    RouteType bigint,SessionID nvarchar(200),ProtoType bigint,

    CyberCode nvarchar(200),IsMove bigint,IsElectric bigint,

    SafeState bigint,GPIOState bigint,Serial nvarchar(200),

    GuildID nvarchar(200),Time nvarchar(200),ManufacturerCode nvarchar(200),

    AreaCode nvarchar(200),UnitCode nvarchar(200),MachineCode nvarchar(200),

    SystemType nvarchar(200),DATASOURCEID bigint,Lon float,

    Lat float,LatLon nvarchar(200),RESOURCETYPE bigint,

    AccessSystemID bigint,InterfaceID bigint,InterfaceGroupID bigint,

    WriterTime nvarchar(200)

);

 

if exists (select * from sysobjects where name='LogTerminal_max')

drop table LogTerminal_max

 

CREATE  TABLE LogTerminal_max(

    rowkey nvarchar(200),TerminalID bigint,TerminalMac nvarchar(200),DeviceType bigint,

    Power bigint,Channel bigint,MaxPower bigint,

    TimeNear nvarchar(200),LonNear float,LatNear float,

    RouteMac nvarchar(200),SSID nvarchar(200),

    SSIDs nvarchar(200),SecurityType nvarchar(200),RealType bigint,

    RealCode nvarchar(200),TerminalType bigint,PcBrand nvarchar(200),

    Phone nvarchar(200),IMSI nvarchar(200),IMEI nvarchar(200),

    OS nvarchar(200),CustomerStartTime nvarchar(200),OffLineTime nvarchar(200),

    Model nvarchar(200),CoordinateX nvarchar(200),CoordinateY nvarchar(200),

    OffLineLon float,OffLineLat float,PcIP nvarchar(200),

    RouteType bigint,SessionID nvarchar(200),ProtoType bigint,

    CyberCode nvarchar(200),IsMove bigint,IsElectric bigint,

    SafeState bigint,GPIOState bigint,Serial nvarchar(200),

    GuildID nvarchar(200),Time nvarchar(200),ManufacturerCode nvarchar(200),

    AreaCode nvarchar(200),UnitCode nvarchar(200),MachineCode nvarchar(200),

    SystemType nvarchar(200),DATASOURCEID bigint,Lon float,

    Lat float,LatLon nvarchar(200),RESOURCETYPE bigint,

    AccessSystemID bigint,InterfaceID bigint,InterfaceGroupID bigint,

    WriterTime nvarchar(200)

);

    "

}

 

#将数据导入到sqlserver中

import_sqlserver(){

    echo " import sqlserver start......."

    sqoop export -connect 'jdbc:sqlserver://192.168.2.219; username=nsmc53; password=123456;database=WFBDCMain'  -table LogTerminal_min --hcatalog-database default --hcatalog-table LogTerminal_min --num-mappers 5;

    sqoop export -connect 'jdbc:sqlserver://192.168.2.219; username=nsmc53; password=123456;database=WFBDCMain'  -table LogTerminal_max --hcatalog-database default --hcatalog-table LogTerminal_max --num-mappers 5;

   

    echo " import sqlserver end......."

}

 

#将hive数据导出到本地目录

export_hive_local(){

    echo " export hive to local start......."

    mkdir -p /home/hive/min

    mkdir -p /home/hive/max

    hive -S -e "\

    insert overwrite local directory '/home/hive/min'   ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' select a.* from LogTerminal a inner join(select TerminalMac,min(time) time from LogTerminal group by TerminalMac) b on a.TerminalMac = b.TerminalMac and a.time = b.time order by a.TerminalMac ;\

    insert overwrite local directory '/home/hive/max'   ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' select a.* from LogTerminal a inner join(select TerminalMac,max(time) time from LogTerminal group by TerminalMac) b on a.TerminalMac = b.TerminalMac and a.time = b.time order by a.TerminalMac ;"

   

#将hive数据导出到csv文件

    hive -e " set hive.cli.print.header=true;select a.* from LogTerminal a inner join(select TerminalMac,min(time) time from LogTerminal group by TerminalMac) b on a.TerminalMac = b.TerminalMac and a.time = b.time order by a.TerminalMac ;" >> /home/hive/logterminal-min.csv

   

    echo "export hive to local end......."

}

 

#将hive数据根据查询条件过滤导入到其他表

export_hive(){

    echo " export hive to sqlserver start......."

#查询时间最小mac

    hive -S -e "INSERT OVERWRITE TABLE LogTerminal_min select a.* from LogTerminal a inner join(select TerminalMac,min(time) time from LogTerminal group by TerminalMac) b on a.TerminalMac = b.TerminalMac and a.time = b.time order by a.TerminalMac ;";

#查询时间最大mac

    hive -S -e "INSERT OVERWRITE TABLE LogTerminal_max select a.* from LogTerminal a inner join(select TerminalMac,max(time) time from LogTerminal group by TerminalMac) b on a.TerminalMac = b.TerminalMac and a.time = b.time order by a.TerminalMac ;";

   

    echo "export hive to sqlserver end......."

}

 

main(){

#创建hive表LogTerminal

    echo " create hive table LogTerminal......."

    hive_table

   

#生成表结构时间范围

    get_time $*

    Style="OVERWRITE"

    date1=$date

    while [[ $date1 < $enddate ]]

    do

        echo "$date"

##创建临时终端日志表TempLogTerminal关联hbase表,重新运行时删除以前创建表结构,并将查询数据导入LogTerminal

        echo " exec  hive_task....... $date1 $Style $3 "

        hive_task $date1 $Style $3

        date1=`date -d "+1 day $date1" +%Y%m%d`

        Style="INTO"

    done

#从hive表LogTerminal日志表导出到本地目录/home/hive

    echo " export hive to sqlserver ......."

    export_hive

    echo " import sqlserver ......."

    import_sqlserver

    echo " query logterminal end......."

}

 

main $*

 

版权声明:本文为博主原创文章,遵循 CC 4.0 BY-SA 版权协议,转载请附上原文出处链接和本声明。
本文链接:https://blog.csdn.net/seashouwang/article/details/81699329

智能推荐

oracle 12c 集群安装后的检查_12c查看crs状态-程序员宅基地

文章浏览阅读1.6k次。安装配置gi、安装数据库软件、dbca建库见下:http://blog.csdn.net/kadwf123/article/details/784299611、检查集群节点及状态:[root@rac2 ~]# olsnodes -srac1 Activerac2 Activerac3 Activerac4 Active[root@rac2 ~]_12c查看crs状态

解决jupyter notebook无法找到虚拟环境的问题_jupyter没有pytorch环境-程序员宅基地

文章浏览阅读1.3w次,点赞45次,收藏99次。我个人用的是anaconda3的一个python集成环境,自带jupyter notebook,但在我打开jupyter notebook界面后,却找不到对应的虚拟环境,原来是jupyter notebook只是通用于下载anaconda时自带的环境,其他环境要想使用必须手动下载一些库:1.首先进入到自己创建的虚拟环境(pytorch是虚拟环境的名字)activate pytorch2.在该环境下下载这个库conda install ipykernelconda install nb__jupyter没有pytorch环境

国内安装scoop的保姆教程_scoop-cn-程序员宅基地

文章浏览阅读5.2k次,点赞19次,收藏28次。选择scoop纯属意外,也是无奈,因为电脑用户被锁了管理员权限,所有exe安装程序都无法安装,只可以用绿色软件,最后被我发现scoop,省去了到处下载XXX绿色版的烦恼,当然scoop里需要管理员权限的软件也跟我无缘了(譬如everything)。推荐添加dorado这个bucket镜像,里面很多中文软件,但是部分国外的软件下载地址在github,可能无法下载。以上两个是官方bucket的国内镜像,所有软件建议优先从这里下载。上面可以看到很多bucket以及软件数。如果官网登陆不了可以试一下以下方式。_scoop-cn

Element ui colorpicker在Vue中的使用_vue el-color-picker-程序员宅基地

文章浏览阅读4.5k次,点赞2次,收藏3次。首先要有一个color-picker组件 <el-color-picker v-model="headcolor"></el-color-picker>在data里面data() { return {headcolor: ’ #278add ’ //这里可以选择一个默认的颜色} }然后在你想要改变颜色的地方用v-bind绑定就好了,例如:这里的:sty..._vue el-color-picker

迅为iTOP-4412精英版之烧写内核移植后的镜像_exynos 4412 刷机-程序员宅基地

文章浏览阅读640次。基于芯片日益增长的问题,所以内核开发者们引入了新的方法,就是在内核中只保留函数,而数据则不包含,由用户(应用程序员)自己把数据按照规定的格式编写,并放在约定的地方,为了不占用过多的内存,还要求数据以根精简的方式编写。boot启动时,传参给内核,告诉内核设备树文件和kernel的位置,内核启动时根据地址去找到设备树文件,再利用专用的编译器去反编译dtb文件,将dtb还原成数据结构,以供驱动的函数去调用。firmware是三星的一个固件的设备信息,因为找不到固件,所以内核启动不成功。_exynos 4412 刷机

Linux系统配置jdk_linux配置jdk-程序员宅基地

文章浏览阅读2w次,点赞24次,收藏42次。Linux系统配置jdkLinux学习教程,Linux入门教程(超详细)_linux配置jdk

随便推点

matlab(4):特殊符号的输入_matlab微米怎么输入-程序员宅基地

文章浏览阅读3.3k次,点赞5次,收藏19次。xlabel('\delta');ylabel('AUC');具体符号的对照表参照下图:_matlab微米怎么输入

C语言程序设计-文件(打开与关闭、顺序、二进制读写)-程序员宅基地

文章浏览阅读119次。顺序读写指的是按照文件中数据的顺序进行读取或写入。对于文本文件,可以使用fgets、fputs、fscanf、fprintf等函数进行顺序读写。在C语言中,对文件的操作通常涉及文件的打开、读写以及关闭。文件的打开使用fopen函数,而关闭则使用fclose函数。在C语言中,可以使用fread和fwrite函数进行二进制读写。‍ Biaoge 于2024-03-09 23:51发布 阅读量:7 ️文章类型:【 C语言程序设计 】在C语言中,用于打开文件的函数是____,用于关闭文件的函数是____。

Touchdesigner自学笔记之三_touchdesigner怎么让一个模型跟着鼠标移动-程序员宅基地

文章浏览阅读3.4k次,点赞2次,收藏13次。跟随鼠标移动的粒子以grid(SOP)为partical(SOP)的资源模板,调整后连接【Geo组合+point spirit(MAT)】,在连接【feedback组合】适当调整。影响粒子动态的节点【metaball(SOP)+force(SOP)】添加mouse in(CHOP)鼠标位置到metaball的坐标,实现鼠标影响。..._touchdesigner怎么让一个模型跟着鼠标移动

【附源码】基于java的校园停车场管理系统的设计与实现61m0e9计算机毕设SSM_基于java技术的停车场管理系统实现与设计-程序员宅基地

文章浏览阅读178次。项目运行环境配置:Jdk1.8 + Tomcat7.0 + Mysql + HBuilderX(Webstorm也行)+ Eclispe(IntelliJ IDEA,Eclispe,MyEclispe,Sts都支持)。项目技术:Springboot + mybatis + Maven +mysql5.7或8.0+html+css+js等等组成,B/S模式 + Maven管理等等。环境需要1.运行环境:最好是java jdk 1.8,我们在这个平台上运行的。其他版本理论上也可以。_基于java技术的停车场管理系统实现与设计

Android系统播放器MediaPlayer源码分析_android多媒体播放源码分析 时序图-程序员宅基地

文章浏览阅读3.5k次。前言对于MediaPlayer播放器的源码分析内容相对来说比较多,会从Java-&amp;amp;gt;Jni-&amp;amp;gt;C/C++慢慢分析,后面会慢慢更新。另外,博客只作为自己学习记录的一种方式,对于其他的不过多的评论。MediaPlayerDemopublic class MainActivity extends AppCompatActivity implements SurfaceHolder.Cal..._android多媒体播放源码分析 时序图

java 数据结构与算法 ——快速排序法-程序员宅基地

文章浏览阅读2.4k次,点赞41次,收藏13次。java 数据结构与算法 ——快速排序法_快速排序法