Channel: SCN : All Content - SAP BusinessObjects Predictive Analytics

Requested array size exceeds VM limit when creating new data set via Hive (Apache Hadoop)


I am trying to build a chain in the "Predict" tab of Predictive Analytics 2.0 (expert mode), but I run into an error when creating the new data set.


I tried to create the new data set via "Query with SQL" using the "Apache Hadoop Hive 0.13 Simba JDBC4 HiveServer2" driver.

After entering the user name, password, and server:port details, I get this error message:

"requested array size exceeds VM limit (failed to allocate 1414811712 bytes) (array length 1414811695)"


Is this a heap size configuration problem, or a bug where PA tries to allocate a huge array?
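For what it's worth, the strange array length in the message can be sanity-checked by decoding it as raw bytes. A minimal Python sketch (my assumption here: the value is being read as a big-endian 32-bit integer, the way a Thrift binary client reads a frame-length header off the wire):

```python
import struct

# The "array length" reported in the error message.
length = 1414811695

# Interpret it as a big-endian 32-bit value, the way a Thrift
# binary transport would read a frame-length prefix.
raw = struct.pack(">I", length)
print(raw)  # b'TTP/'
```

Those bytes look like a fragment of an "HTTP/1.1 ..." response being misread as a length field, which would point at the client reaching an HTTP endpoint (for example HiveServer2 in HTTP transport mode, or a wrong port) rather than at heap sizing. That is an inference from the decoded value, not something the error message itself states.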

Could you please advise?


Thanks,

Veronica
