Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/12/01 19:30:22 UTC

[GitHub] [airflow] p-madduri opened a new issue, #28042: Type Error while using dynamodb_to_s3 operator

p-madduri opened a new issue, #28042:
URL: https://github.com/apache/airflow/issues/28042

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   https://github.com/apache/airflow/blob/430e930902792fc37cdd2c517783f7dd544fbebf/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py#L39
   
   If we use the function at line 39:

   def _convert_item_to_json_bytes(item: dict[str, Any]) -> bytes:
       return (json.dumps(item) + "\n").encode("utf-8")
   
   it throws the following error:

   TypeError: Object of type Decimal is not JSON serializable
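   
   For context, boto3's DynamoDB resource deserializes number attributes as decimal.Decimal, and the standard json encoder cannot serialize Decimal values, which is what triggers the error above. A minimal repro outside Airflow:
   
   from decimal import Decimal
   import json
   
   json.dumps({"price": Decimal("9.99")})
   # TypeError: Object of type Decimal is not JSON serializable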
   
   Could we use something like the following instead?
   
   from collections.abc import Iterable, Mapping
   from decimal import Decimal
   import json
   
   class DecimalEncoder(json.JSONEncoder):
       def encode(self, obj):
           if isinstance(obj, Mapping):
               return '{' + ', '.join(f'{self.encode(k)}: {self.encode(v)}' for (k, v) in obj.items()) + '}'
           elif isinstance(obj, Iterable) and not isinstance(obj, str):
               return '[' + ', '.join(map(self.encode, obj)) + ']'
           elif isinstance(obj, Decimal):
               # normalize() drops trailing zeros; ':f' prevents scientific notation
               return f'{obj.normalize():f}'
           else:
               return super().encode(obj)
   
   We would also need to update the code at this line:
   https://github.com/apache/airflow/blob/430e930902792fc37cdd2c517783f7dd544fbebf/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py#L99
   
   This solution is suggested in this article:
   https://randomwits.com/blog/export-dynamodb-s3
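   
   A minimal sketch of how the conversion helper could use such an encoder, assuming DecimalEncoder is defined or imported in the same module (json.dumps accepts a custom encoder class via its cls argument):
   
   def _convert_item_to_json_bytes(item: dict[str, Any]) -> bytes:
       # cls=DecimalEncoder routes serialization through the custom encoder,
       # so Decimal values returned by DynamoDB no longer raise TypeError
       return (json.dumps(item, cls=DecimalEncoder) + "\n").encode("utf-8")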
   
   Airflow version on MWAA: 2.0.2
   
   ### What you think should happen instead
   
   Mentioned in the "What happened" section above.
   
   ### How to reproduce
   
   Mentioned in the "What happened" section above.
   
   ### Operating System
   
   macOS
   
   ### Versions of Apache Airflow Providers
   
   from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator
   
   ### Deployment
   
   MWAA
   
   ### Deployment details
   
   n/a
   
   ### Anything else
   
   n/a
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] potiuk closed issue #28042: Type Error while using dynamodb_to_s3 operator

Posted by GitBox <gi...@apache.org>.
potiuk closed issue #28042: Type Error while using dynamodb_to_s3 operator
URL: https://github.com/apache/airflow/issues/28042




[GitHub] [airflow] boring-cyborg[bot] commented on issue #28042: Type Error while using dynamodb_to_s3 operator

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #28042:
URL: https://github.com/apache/airflow/issues/28042#issuecomment-1334258640

   Thanks for opening your first issue here! Be sure to follow the issue template!
   

