Posted to issues@spark.apache.org by "zehra (Jira)" <ji...@apache.org> on 2022/07/07 08:04:00 UTC
[jira] [Updated] (SPARK-39708) ALS Model Loading
[ https://issues.apache.org/jira/browse/SPARK-39708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
zehra updated SPARK-39708:
--------------------------
Priority: Critical (was: Major)
> ALS Model Loading
> -----------------
>
> Key: SPARK-39708
> URL: https://issues.apache.org/jira/browse/SPARK-39708
> Project: Spark
> Issue Type: Question
> Components: PySpark, Spark Submit
> Affects Versions: 3.2.0
> Reporter: zehra
> Priority: Critical
> Labels: model, pyspark
>
> I have an ALS model and saved it with this code:
> {code:java}
> als_path = "saved_models/best"
> best_model.save(sc, path=als_path){code}
> However, when I try to load this model, it gives this error message:
>
> {code:java}
> ---> 10 model2 = ALS.load(als_path)
>
> File /usr/local/spark/python/pyspark/ml/util.py:332, in MLReadable.load(cls, path)
> 329 @classmethod
> 330 def load(cls, path):
> 331 """Reads an ML instance from the input path, a shortcut of `read().load(path)`."""
> --> 332 return cls.read().load(path)
>
> File /usr/local/spark/python/pyspark/ml/util.py:282, in JavaMLReader.load(self, path)
> 280 if not isinstance(path, str):
> 281 raise TypeError("path should be a string, got type %s" % type(path))
> --> 282 java_obj = self._jread.load(path)
> 283 if not hasattr(self._clazz, "_from_java"):
> 284 raise NotImplementedError("This Java ML type cannot be loaded into Python currently: %r"
> 285 % self._clazz)
>
> File /usr/local/spark/python/lib/py4j-0.10.9.3-src.zip/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
> 1315 command = proto.CALL_COMMAND_NAME +\
> 1316 self.command_header +\
> 1317 args_command +\
> 1318 proto.END_COMMAND_PART
> 1320 answer = self.gateway_client.send_command(command)
> -> 1321 return_value = get_return_value(
> 1322 answer, self.gateway_client, self.target_id, self.name)
> 1324 for temp_arg in temp_args:
> 1325 temp_arg._detach()
>
> File /usr/local/spark/python/pyspark/sql/utils.py:111, in capture_sql_exception.<locals>.deco(*a, **kw)
> 109 def deco(*a, **kw):
> 110 try:
> --> 111 return f(*a, **kw)
> 112 except py4j.protocol.Py4JJavaError as e:
> 113 converted = convert_exception(e.java_exception)
>
> File /usr/local/spark/python/lib/py4j-0.10.9.3-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
> 324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
> 325 if answer[1] == REFERENCE_TYPE:
> --> 326 raise Py4JJavaError(
> 327 "An error occurred while calling {0}{1}{2}.\n".
> 328 format(target_id, ".", name), value)
> 329 else:
> 330 raise Py4JError(
> 331 "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
> 332 format(target_id, ".", name, value))
>
> Py4JJavaError: An error occurred while calling o372.load.
> : org.json4s.MappingException: Did not find value which can be converted into java.lang.String
> at org.json4s.reflect.package$.fail(package.scala:53)
> at org.json4s.Extraction$.$anonfun$convert$2(Extraction.scala:881)
> at scala.Option.getOrElse(Option.scala:189)
> at org.json4s.Extraction$.convert(Extraction.scala:881)
> at org.json4s.Extraction$.$anonfun$extract$10(Extraction.scala:456)
> at org.json4s.Extraction$.$anonfun$customOrElse$1(Extraction.scala:780)
>
> {code}
>
> I tried both `ALS.load` and `ALSModel.load`, as shown in the Apache Spark documentation:
> [https://spark.apache.org/docs/latest/api/python/reference/api/pyspark.ml.recommendation.ALS.html#:~:text=als_path%20%3D%20temp_path%20%2B%20%22/als%22%0A%3E%3E%3E]
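>
> Note: the `save(sc, path)` signature above matches the RDD-based `pyspark.mllib.recommendation.MatrixFactorizationModel` API, while `ALS.load` / `ALSModel.load` belong to the DataFrame-based `pyspark.ml` API, whose reader expects different on-disk metadata; this kind of mismatch can produce a json4s `MappingException`. A minimal sketch of keeping each API's save/load calls paired (assuming `best_model` is a fitted model; `sc` is the active SparkContext):
> {code:python}
> # DataFrame-based API: path only, no SparkContext argument.
> # A fitted model is loaded with ALSModel.load, not ALS.load
> # (ALS.load reads a saved estimator, not a saved model).
> from pyspark.ml.recommendation import ALSModel
>
> als_path = "saved_models/best"
> best_model.write().overwrite().save(als_path)
> model2 = ALSModel.load(als_path)
>
> # RDD-based API: both save and load take the SparkContext.
> from pyspark.mllib.recommendation import MatrixFactorizationModel
>
> best_model.save(sc, als_path)
> model2 = MatrixFactorizationModel.load(sc, als_path)
> {code}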
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org