Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.2.0
    • Fix Version/s: 0.3.0
    • Component/s: Data Module
    • Labels: None

Description

This looks like it was introduced by a recent change in Parquet. TestFileSystemDataset#testPartitionedWriterDouble fails with an OutOfMemoryError when the partitioned writer is closed:

      java.lang.OutOfMemoryError: Java heap space
      	at parquet.column.values.bitpacking.ByteBasedBitPackingEncoder.initPackedSlab(ByteBasedBitPackingEncoder.java:84)
      	at parquet.column.values.bitpacking.ByteBasedBitPackingEncoder.<init>(ByteBasedBitPackingEncoder.java:54)
      	at parquet.column.values.bitpacking.ByteBitPackingValuesWriter.reset(ByteBitPackingValuesWriter.java:63)
      	at parquet.column.impl.ColumnWriterImpl.writePage(ColumnWriterImpl.java:87)
      	at parquet.column.impl.ColumnWriterImpl.flush(ColumnWriterImpl.java:158)
      	at parquet.column.impl.ColumnWriteStoreImpl.flush(ColumnWriteStoreImpl.java:99)
      	at parquet.hadoop.ParquetRecordWriter.flushStore(ParquetRecordWriter.java:130)
      	at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:96)
      	at parquet.hadoop.ParquetWriter.close(ParquetWriter.java:61)
      	at com.google.common.io.Closeables.close(Closeables.java:77)
      	at com.cloudera.data.filesystem.ParquetFileSystemDatasetWriter.close(ParquetFileSystemDatasetWriter.java:102)
      	at com.cloudera.data.filesystem.PartitionedDatasetWriter.close(PartitionedDatasetWriter.java:147)
      	at com.cloudera.data.filesystem.TestFileSystemDataset.writeTestUsers(TestFileSystemDataset.java:366)
      	at com.cloudera.data.filesystem.TestFileSystemDataset.testPartitionedWriterDouble(TestFileSystemDataset.java:217)
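The trace shows the allocation failing in ByteBasedBitPackingEncoder.initPackedSlab while ColumnWriterImpl flushes a page during PartitionedDatasetWriter.close, which suggests the per-column write buffers of the concurrently open partition writers add up. A rough sketch of that heap arithmetic (the per-column buffer figure is an assumption for illustration, not a measured value from the Parquet source):

```java
// Back-of-the-envelope heap estimate for a partitioned Parquet write.
// PartitionedDatasetWriter keeps one Parquet writer open per partition,
// and each open column writer holds its own page buffers and packed slabs,
// so per-column memory multiplies across partitions.
public class ParquetHeapEstimate {
    // Hypothetical figure for illustration only, not taken from Parquet.
    static final long BYTES_PER_COLUMN_WRITER = 1L * 1024 * 1024; // ~1 MiB

    static long estimateHeapBytes(int openPartitionWriters, int columnsPerFile) {
        return (long) openPartitionWriters * columnsPerFile * BYTES_PER_COLUMN_WRITER;
    }

    public static void main(String[] args) {
        // 100 open partitions x 10 columns at ~1 MiB each.
        long needed = estimateHeapBytes(100, 10);
        System.out.println("Estimated buffer heap: " + (needed / (1024 * 1024)) + " MiB");
    }
}
```

Under these assumed figures, even a modest number of partitions and columns exceeds the default heap of a typical test JVM, which would explain the failure surfacing in the test suite.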
      


People

    • Assignee: Tom White
    • Reporter: Tom White
    • Votes: 0
    • Watchers: 0
