SVMWithSGD in Spark documentation example not working


I'm running Spark 1.1.0 with PySpark.

When I run the example taken directly from the documentation:

  from pyspark.mllib.regression import LabeledPoint
  from pyspark.mllib.classification import SVMWithSGD
  import array

  data = [LabeledPoint(1.0, [2.0]), LabeledPoint(1.0, [3.0])]
  svm = SVMWithSGD.train(sc.parallelize(data))
  svm.predict(array([1.0]))

I get an error:

  TypeError                                 Traceback (most recent call last)
  <ipython-input-5-68bcda022b28> in <module>()
       12
       13 svm = SVMWithSGD.train(sc.parallelize(data))
  ---> 14 svm.predict(array([1.0]))

  TypeError: 'module' object is not callable

What could be the problem?

I was importing the wrong `array` package. I had:

  import array

It should be:

  from numpy import array
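The error can be reproduced without Spark at all: `import array` binds the standard-library *module* to the name `array`, so `array([1.0])` tries to call a module. A minimal stdlib-only sketch (the numpy import from the fix above is only referenced in a comment here):

```python
import array  # the wrong import: binds the stdlib *module*, not a function

# svm.predict(array([1.0])) ends up doing this: calling the module itself.
try:
    array([1.0])
except TypeError as e:
    print(e)  # 'module' object is not callable

# The fix is `from numpy import array`, which binds a callable instead.
# The stdlib module's equivalent callable is array.array:
vec = array.array('d', [1.0])
print(list(vec))  # [1.0]
```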
