Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Incomplete
Affects Version/s: 4.2.0, 4.3.0
Fix Version/s: None
Component/s: con.spark
Labels: None
Environment:
ubuntu 16.04
spark 2.3.3
hive 2.3.4
hadoop 2.8.5
hue 4.3.0
Description
When using pyspark to call HiveContext to query a table created by Hive, the fields are resolved incorrectly and the query fails.
The table was created in Hive with:
CREATE TABLE IF NOT EXISTS mdw.t_sd_mobile_user_log(
imei string,
start_time string,
end_time string,
type1 string,
jizhan_num string,
platform int,
app_type string,
app_name string,
sz_ll int,
xz_ll int
)
When pyspark in Hue calls HiveContext to query this table, the fields are resolved incorrectly and the query fails.
The resolved fields carry unexpected attribute IDs:
imei#272, start_time#273, end_time#274, type1#275, jizhan_num#276, platform#277, app_type#278, app_name#279, sz_ll#280, xz_ll#281, statis_day#282
Traceback (most recent call last):
  File "/usr/local/spark-2.3.3/python/lib/pyspark.zip/pyspark/sql/dataframe.py", line 350, in show
    print(self._jdf.showString(n, 20, vertical))
  File "/usr/local/spark-2.3.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/local/spark-2.3.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/usr/local/spark-2.3.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
Py4JJavaError: An error occurred while calling o280.showString.
: java.lang.AssertionError: assertion failed: No plan for HiveTableRelation mdw.t_sd_mobile_user_log, org.apache.hadoop.hive.serde2.OpenCSVSerde, imei#272, start_time#273, end_time#274, type1#275, jizhan_num#276, platform#277, app_type#278, app_name#279, sz_ll#280, xz_ll#281, statis_day#282
at scala.Predef$.assert(Predef.scala:170)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93) at