Posted to issues@spark.apache.org by "Bago Amirbekian (JIRA)" <ji...@apache.org> on 2017/05/24 01:13:04 UTC
[jira] [Created] (SPARK-20862) LogisticRegressionModel throws TypeError
Bago Amirbekian created SPARK-20862:
---------------------------------------
Summary: LogisticRegressionModel throws TypeError
Key: SPARK-20862
URL: https://issues.apache.org/jira/browse/SPARK-20862
Project: Spark
Issue Type: Bug
Components: MLlib, PySpark
Affects Versions: 2.1.1
Reporter: Bago Amirbekian
Priority: Minor
LogisticRegressionModel throws a TypeError when run under Python 3 with NumPy 1.12.1:
**********************************************************************
File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 155, in __main__.LogisticRegressionModel
Failed example:
    mcm = LogisticRegressionWithLBFGS.train(data, iterations=10, numClasses=3)
Exception raised:
    Traceback (most recent call last):
      File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/doctest.py", line 1330, in __run
        compileflags, 1), test.globs)
      File "<doctest __main__.LogisticRegressionModel[23]>", line 1, in <module>
        mcm = LogisticRegressionWithLBFGS.train(data, iterations=10, numClasses=3)
      File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 398, in train
        return _regression_train_wrapper(train, LogisticRegressionModel, data, initialWeights)
      File "/Users/bago/repos/spark/python/pyspark/mllib/regression.py", line 216, in _regression_train_wrapper
        return modelClass(weights, intercept, numFeatures, numClasses)
      File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 176, in __init__
        self._dataWithBiasSize)
    TypeError: 'float' object cannot be interpreted as an integer
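A likely explanation for the failure mode (a sketch of my assumption about the root cause, not the exact Spark source): in Python 3 the division operator "/" always returns a float, and NumPy 1.12 tightened its argument checking so that reshape() rejects float dimensions outright instead of emitting a DeprecationWarning. The snippet below reproduces the error in miniature; the variable names (weights, num_classes, data_with_bias_size) are illustrative, chosen to mirror the attributes in the traceback.

```python
import numpy as np

weights = np.zeros(6)
num_classes = 3

# In Python 3, "/" is true division, so this is 3.0 (a float), even though
# the sizes divide evenly. Python 2's "/" would have produced the int 3.
data_with_bias_size = weights.size / (num_classes - 1)

# NumPy >= 1.12 refuses float dimensions in reshape().
try:
    weights.reshape(num_classes - 1, data_with_bias_size)
except TypeError as e:
    print(e)  # 'float' object cannot be interpreted as an integer

# Floor division "//" keeps the dimension an integer and avoids the error.
matrix = weights.reshape(num_classes - 1, weights.size // (num_classes - 1))
print(matrix.shape)
```

If this is indeed the cause, replacing "/" with "//" (or wrapping the size in int()) in the LogisticRegressionModel constructor should be sufficient.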
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)