I have a DataFrame that contains a really big integer value, for example:
42306810747081022358
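For what it's worth, here is a minimal setup that reproduces the problem (the construction below is just for illustration; in my real code c_a comes from an input source, but it is a string column either way):

    import java.util.Arrays;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.RowFactory;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.StructType;

    // Build a one-row DataFrame holding the problematic value as a string.
    StructType schema = DataTypes.createStructType(Arrays.asList(
            DataTypes.createStructField("c_a", DataTypes.StringType, false)));
    DataFrame dframe = sqlContext.createDataFrame(
            Arrays.asList(RowFactory.create("42306810747081022358")), schema);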
When I tried to convert it to a long, it worked in plain Java but not in the Spark environment, where I got:

    NumberFormatException: For input string: "42306810747081022358"
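The conversion attempt itself looked something like this (a sketch reconstructed from memory; the exact call in my code may have differed slightly):

    import static org.apache.spark.sql.functions.col;
    import org.apache.spark.sql.types.DataTypes;

    // Attempt 1: cast the string column to a long.
    dframe = dframe.withColumn("c_number", col("c_a").cast(DataTypes.LongType));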
Then I tried to convert it to a Decimal (BigDecimal) value. Again, that is easy to do in plain Java, but in Spark:

    dframe = dframe.withColumn("c_number", col("c_a").cast(new DecimalType()));
This way I don't get any exceptions, but I can see that all the resulting values are null.
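This is how I check the result (every row comes out null):

    // Inspect the converted column.
    dframe.select("c_number").show();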
I also tried to use a UDF for this purpose, but got the same kind of result:
    UDF1<String, BigDecimal> cTransformer = new UDF1<String, BigDecimal>() {
        @Override
        public BigDecimal call(String aString) throws Exception {
            // Parse the string into an arbitrary-precision decimal.
            return new BigDecimal(aString);
        }
    };
    sqlContext.udf().register("cTransformer", cTransformer, new DecimalType());
    dframe = dframe.withColumn("c_number", callUDF("cTransformer", dframe.col("c_a")));
And here again, all I get is a column of zeros.
How should I proceed?