Posted to dev@spark.apache.org by Great Info <gu...@gmail.com> on 2018/07/13 12:30:02 UTC

Spark: how to rename or access columns which have special chars like " ?:

I have columns like below:

     root
      |-- metadata: struct (nullable = true)
      |    |-- "drop":{"dropPath":"https://dstpath.media27.ec2.st-av.net/drop?source_id: string (nullable = true)
      |    |-- "selection":{"AlllURL":"https://dstpath.media27.ec2.st-av.net/image?source_id: string (nullable = true)
      |    |-- "dstpath":"https://dstpath.media28.ec2.st-av.net/image?source_id: string (nullable = true)


Now there is a problem selecting any column, since all the column names contain
special characters.


     "drop":{"dropPath":"https://dstpath.media27.ec2.st-av.net/drop?source_id: string (nullable = true)

This column name contains the special characters " : { . and ?.

How can I select or rename this column in Spark?

I tried:

     df.select('`metada."drop":{"dropPath":"https://dstpath.media27.ec2.st-av.net/drop?source_id`')

which gives: error: unclosed character literal
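For context, the "unclosed character literal" message is the Scala compiler complaining about the single quotes: in Scala, 'x' is a Char literal, so the long column name cannot be wrapped that way. A minimal sketch of one possible workaround, assuming a SparkSession `spark` and the DataFrame `df` from the schema above (the alias name "dropPath" is a hypothetical choice, not from the original post):

```scala
// Sketch only, not a definitive answer.
// Use a double-quoted Scala String for the column reference, and backticks
// around the field segment so Spark does not treat the embedded " : { . ?
// characters (and the dots in the URL) as part of its column-path syntax.
// The literal " characters inside the field name are escaped as \" here.
val uglyCol =
  "metadata.`\"drop\":{\"dropPath\":\"https://dstpath.media27.ec2.st-av.net/drop?source_id`"

// Select the nested field directly:
df.select(df.col(uglyCol))

// Or give it a plain name at the same time, so later code can use a
// normal identifier instead of the backtick-escaped one:
val renamed = df.select(df.col(uglyCol).alias("dropPath"))
```

The key points are: double quotes (not single quotes) for the Scala string, and backticks inside that string to quote the Spark column identifier.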



Regards
Indra