Hi,
Env) Hadoop and IQ run on separate hosts.
1) CentOS 7 + Apache Hadoop 2.7.0 + Apache Hive 1.2 + Thrift hiveserver2
2) IQ16 SP08 PL32 + unixODBC 2.3.1 + Hortonworks Hive ODBC 2.2
I have already unlinked the IQ ODBC driver manager as described in the white paper.
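In case it helps, this is roughly what that unlink step looked like on my side (only a sketch based on my reading of the white paper; the exact library names under the IQ lib64 directory and the unixODBC library path are assumptions and may differ):

cd /home/iq16/IQ-16_0/lib64                     # IQ server's lib64 directory
mv libodbc.so libodbc.so.orig                   # set aside IQ's bundled driver manager links (names assumed)
mv libodbc.so.1 libodbc.so.1.orig
mv libodbcinst.so libodbcinst.so.orig
mv libodbcinst.so.1 libodbcinst.so.1.orig
ln -s /usr/lib64/libodbc.so.2 libodbc.so.1      # point IQ at unixODBC 2.3.1 instead (path assumed)
ln -s /usr/lib64/libodbcinst.so.2 libodbcinst.so.1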
When I try to connect to Hadoop, the following error is returned.
(DBA)> create server hive_test class 'ODBC' using 'dsn=hive'
Execution time: 0.003 seconds

(DBA)> forward to hive_test
0 row(s) affected
Execution time: 0.001 seconds

(DBA)> select * from testTable
Could not execute statement.
Unable to connect to server 'hive_test': [unixODBC][
SQLCODE=-656, ODBC 3 State="HY000"
Line 1, column 1
select * from testTable
Press ENTER to continue...
The table (testTable) is stored in HDFS, and querying it directly through unixODBC with isql works:
[iq16@CentOS ~]$ /usr/bin/isql -v hive
… SQL> select * from testTable;
+------------+---------+---------------------------------+
| num        | ename   | kname                           |
+------------+---------+---------------------------------+
| 1          | tom     | xxx                             |
| 2          | jack    | xxx                             |
| 3          | july    | jjj                             |
+------------+---------+---------------------------------+
Here is the ODBC configuration (odbc.ini):
[hive]
Driver=/usr/lib/hive/lib/native/Linux-amd64-64/libhortonworkshiveodbc64.so
HOST=localhost
PORT=10000
Schema=default
HiveServerType=2
HS2AuthMech=2
UserName=hive
UID=hive
PWD=

[iq16]
uid=dba
pwd=sql
eng=insung_iqdemo
dbn=iqdemo
commlinks=tcpip{host=localhost;port=2638}
Driver=/home/hadoop/iq16_sp08_pl32/IQ-16_0/lib64/libdbodbc16.so
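(For reference, the kind of check I run to see which driver manager these drivers actually resolve to; just a sketch using the paths from the odbc.ini above.)

odbcinst -j                                     # show which odbc.ini / odbcinst.ini unixODBC is reading
odbcinst -q -s                                  # list the DSNs unixODBC can see (should include [hive])
ldd /usr/lib/hive/lib/native/Linux-amd64-64/libhortonworkshiveodbc64.so | grep -i odbc
ldd /home/hadoop/iq16_sp08_pl32/IQ-16_0/lib64/libdbodbc16.so | grep -i odbc
ls -l /home/iq16/IQ-16_0/lib64/libodbc*         # check whether IQ's own libodbc* links are still there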
Here is what I did:
- Changed localhost to the IP address / hostname in odbc.ini
- Added ODBCINI and LIBPATH to .bash_profile (see the sketch after this list)
Ex) export LIBPATH="/home/iq16/IQ-16_0/lib64"
export ODBCINI="/home/iq16/odbc.ini"
- Unlinked the IQ ODBC driver manager
- Set LANG=C
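The fuller version of those profile settings (a sketch of what I have in mind; I am assuming that on Linux it is LD_LIBRARY_PATH, rather than LIBPATH, that the IQ server and unixODBC actually honour, and I add the Hortonworks driver directory as well):

# ~/.bash_profile for the iq16 user (paths as used elsewhere in this post)
export ODBCINI=/home/iq16/odbc.ini
export LD_LIBRARY_PATH=/home/iq16/IQ-16_0/lib64:/usr/lib64:/usr/lib/hive/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH
# the IQ server is then restarted from a shell that has these settings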
…
…
Any comments or advice would be appreciated.
==
Gi-Sung Jang