Posted to dev@hudi.apache.org by Jaimin Shah <sh...@gmail.com> on 2019/10/24 11:54:56 UTC
Error while writing table
Hi,
I recently moved from version 0.4.5 to release-0.5.0-incubating-rc6.
While writing data to a Hudi MOR table I am getting the error
java.lang.NoClassDefFoundError: org/apache/parquet/bytes/ByteBufferAllocator.
ByteBufferAllocator is not part of parquet-common as packaged
with Spark 2.3.2, so do I need to include any other packages on the classpath?
As of now I have included all the Spark dependencies and
hudi-spark-bundle-0.5.0-incubating-rc6.jar.
Thanks,
Jaimin
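The thread never shows the exact command, but a common way to rule out a missing parquet class is to put the matching jars on both the driver and executor classpath explicitly. The sketch below is hypothetical: the jar paths, the parquet-common version, the main class, and the application jar name are all placeholders, not taken from the thread.

```shell
# Hypothetical spark-submit invocation (Spark 2.3.2-era syntax).
# All paths, versions, and class names below are placeholders.
spark-submit \
  --class com.example.HudiWriter \
  --jars /path/to/hudi-spark-bundle-0.5.0-incubating-rc6.jar,/path/to/parquet-common-1.8.3.jar \
  --conf spark.driver.extraClassPath=/path/to/parquet-common-1.8.3.jar \
  --conf spark.executor.extraClassPath=/path/to/parquet-common-1.8.3.jar \
  /path/to/my-writer-job.jar
```

Note that `--jars` distributes the jars to executors, while the two `extraClassPath` settings prepend them to the JVM classpath, which matters when a conflicting parquet version is already bundled.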
Re: Error while writing table
Posted by Jaimin Shah <sh...@gmail.com>.
Hi,
Thanks for the prompt response. It was a classpath issue and I was able
to resolve it.
Thanks,
Jaimin
On Friday, 25 October 2019, Bhavani Sudha <bh...@gmail.com> wrote:
> Hi Jaimin,
>
> I don't think we are excluding parquet-common anywhere. Could you paste the
> stacktrace?
>
> Thanks,
> Sudha
>
>
>
> On Thu, Oct 24, 2019 at 4:55 AM Jaimin Shah <sh...@gmail.com>
> wrote:
>
> > Hi,
> > I recently moved from version 0.4.5 to release-0.5.0-incubating-rc6.
> > While writing data to a Hudi MOR table I am getting the error
> > java.lang.NoClassDefFoundError: org/apache/parquet/bytes/ByteBufferAllocator.
> > ByteBufferAllocator is not part of parquet-common as packaged
> > with Spark 2.3.2, so do I need to include any other packages on the
> > classpath?
> > As of now I have included all the Spark dependencies and
> > hudi-spark-bundle-0.5.0-incubating-rc6.jar.
> >
> > Thanks,
> > Jaimin
> >
>
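Jaimin reports that this turned out to be a classpath issue. One generic way to verify that kind of fix is a `Class.forName` probe run inside the same JVM as the writer job; this is a sketch of the technique, not code from the thread, and the helper name is my own.

```java
// Generic runtime probe: reports whether a class is visible to the
// current classloader. Useful for confirming a classpath fix.
public class ClasspathCheck {

    /** Returns true if the named class can be loaded by this classloader. */
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the Hudi writer failed to load in this thread.
        String cls = "org.apache.parquet.bytes.ByteBufferAllocator";
        System.out.println(cls + (isOnClasspath(cls) ? " is on the classpath" : " is MISSING"));
    }
}
```

Running this with the same `--jars` / `extraClassPath` settings as the failing job shows immediately whether the jar actually made it onto the classpath.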
Re: Error while writing table
Posted by Bhavani Sudha <bh...@gmail.com>.
Hi Jaimin,
I don't think we are excluding parquet-common anywhere. Could you paste the
stacktrace?
Thanks,
Sudha
On Thu, Oct 24, 2019 at 4:55 AM Jaimin Shah <sh...@gmail.com>
wrote:
> Hi,
> I recently moved from version 0.4.5 to release-0.5.0-incubating-rc6.
> While writing data to a Hudi MOR table I am getting the error
> java.lang.NoClassDefFoundError: org/apache/parquet/bytes/ByteBufferAllocator.
> ByteBufferAllocator is not part of parquet-common as packaged
> with Spark 2.3.2, so do I need to include any other packages on the classpath?
> As of now I have included all the Spark dependencies and
> hudi-spark-bundle-0.5.0-incubating-rc6.jar.
>
> Thanks,
> Jaimin
>