Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 0.3
- Fix Version/s: 0.3
- Component/s: Interpreter
- Labels: None
Description
A HiveContext is always created, regardless of whether we enable it through spark.repl.enableHiveContext.
The root cause is that we depend on Spark's shell.py. As the code below shows, as long as Spark is built with Hive, a HiveContext is always created. Unfortunately, HiveContext initializes itself lazily, only when one of its methods is called, so the line 'sqlContext = HiveContext(sc)' would not throw an exception and the fallback to SQLContext is never reached.
try:
    # Try to access HiveConf; it will raise an exception if Hive is not added
    sc._jvm.org.apache.hadoop.hive.conf.HiveConf()
    sqlContext = HiveContext(sc)
except py4j.protocol.Py4JError:
    sqlContext = SQLContext(sc)
except TypeError:
    sqlContext = SQLContext(sc)
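One possible fix is to gate the HiveContext creation on the Zeppelin flag itself, rather than relying only on Spark's classpath probe. The sketch below is illustrative, not the actual patch: the helper name and the stub constructors are hypothetical stand-ins for HiveContext(sc) and SQLContext(sc).

```python
def choose_sql_context(enable_hive, hive_available, hive_ctor, sql_ctor):
    """Create a HiveContext only when it is explicitly enabled via
    spark.repl.enableHiveContext AND Hive is actually on the classpath;
    otherwise fall back to a plain SQLContext."""
    if enable_hive and hive_available:
        return hive_ctor()
    return sql_ctor()

# Stubs standing in for HiveContext(sc) / SQLContext(sc):
make_hive = lambda: "HiveContext"
make_sql = lambda: "SQLContext"

# Flag disabled: SQLContext even though Hive is on the classpath.
print(choose_sql_context(False, True, make_hive, make_sql))   # SQLContext
# Flag enabled and Hive present: HiveContext.
print(choose_sql_context(True, True, make_hive, make_sql))    # HiveContext
# Flag enabled but Hive missing: fall back to SQLContext.
print(choose_sql_context(True, False, make_hive, make_sql))   # SQLContext
```

With this check, building Spark with Hive no longer forces a HiveContext on users who left spark.repl.enableHiveContext unset.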