Today a customer reached out to me asking how to debug the Hive error shown below. The error is caused by the Java process running out of heap space.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.nio.HeapCharBuffer.<init>(HeapCharBuffer.java:57)
    at java.nio.CharBuffer.allocate(CharBuffer.java:331)
    at java.nio.charset.CharsetDecoder.decode(CharsetDecoder.java:777)
    at org.apache.hadoop.io.Text.decode(Text.java:412)
    at org.apache.hadoop.io.Text.decode(Text.java:389)
    at org.apache.hadoop.io.Text.toString(Text.java:280)
    at org.openx.data.jsonserde.JsonSerDe.deserialize(JsonSerDe.java:165)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:647)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:561)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:138)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1623)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:267)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:345)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:443)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:459)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
To resolve this, increase the memory settings by running the following commands at the Hive prompt before the query:
set mapreduce.map.memory.mb=2048;
set mapreduce.reduce.memory.mb=4096;
select * from table1 LIMIT 1;
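Note that mapreduce.map.memory.mb and mapreduce.reduce.memory.mb only raise the YARN container limits; the heap available to each task JVM is controlled separately by the *.java.opts properties. If the error persists, those can be raised as well. A minimal sketch, assuming the common guideline of giving the task JVM roughly 80% of its container size (the -Xmx values below are illustrative, adjust them to your cluster):
-- task JVM heap, roughly 80% of the 2048 MB map container
set mapreduce.map.java.opts=-Xmx1638m;
-- task JVM heap, roughly 80% of the 4096 MB reduce container
set mapreduce.reduce.java.opts=-Xmx3276m;
select * from table1 LIMIT 1;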