Followed http://hadooptutorial.info/avro-mapreduce-word-count-example/, but caution: I needed to set the javac CLASSPATH differently, to
export CLASSPATH="$HADOOP_HOME/share/hadoop/tools/lib/*"
export CLASSPATH="$HADOOP_HOME/share/hadoop/mapreduce/*:$CLASSPATH"
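With that CLASSPATH in place, the compile step looked roughly like this. A sketch only: `AvroWordCount.java` is my placeholder for whatever the tutorial's source file is actually called, and on some layouts the jars under `share/hadoop/common` may be needed too.

```shell
# Sketch: compile the tutorial's word-count class against the Hadoop jars.
# AvroWordCount.java is a placeholder name; adjust to the real source file.
export CLASSPATH="$HADOOP_HOME/share/hadoop/tools/lib/*"
export CLASSPATH="$HADOOP_HOME/share/hadoop/mapreduce/*:$CLASSPATH"

# javac picks up the CLASSPATH env var; dir/* classpath wildcards are
# expanded by the Java tools themselves, so the quoting is fine.
javac AvroWordCount.java

# Package the compiled classes into a jar for "hadoop jar".
jar cf avrowordcount.jar *.class
```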
Added yarn.resourcemanager.address to yarn-site.xml per the answer.
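For reference, the property goes in yarn-site.xml. A minimal sketch, assuming a single-node setup where the ResourceManager runs on localhost (8032 is the default client port):

```xml
<!-- yarn-site.xml (sketch; single-node assumption) -->
<property>
  <name>yarn.resourcemanager.address</name>
  <value>localhost:8032</value>
</property>
```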
This time I didn't get the same error, but the job seems stuck.
Issued stop-all.sh and modified the Hadoop config files to match a single-node installation.
The tests under the single-node Testing section gave the same error as before: INFO ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032.
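The 0.0.0.0:8032 target is the giveaway here: 8032 is the ResourceManager's default client port, and 0.0.0.0 is what the client falls back to when no ResourceManager host is configured, so either the RM isn't running or yarn-site.xml doesn't point at it. A quick check (assuming the JDK's jps tool and netstat are available):

```shell
# Is the ResourceManager process up at all?
jps | grep ResourceManager

# Is anything actually listening on the RM client port?
netstat -tln | grep 8032
```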
Reinstalled to a local user dir per the alexjf blog.
Still attempts to connect to 0.0.0.0/0.0.0.0:8032.
Tried the tests; got a different error this time: could not log in.
Tried reinstalling the OS, but that did not work either.
Tried the Hadoop installation and test from the alexjf blog. It failed the test; pulling up the URL shows memory limit exceeded as the cause of failure.
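Since the failure reports a memory limit, the relevant knobs are the YARN memory properties in yarn-site.xml. A sketch of the kind of settings involved — the values here are illustrative for a small box, not a recommendation, and the virtual-memory check is a common culprit for containers killed with a memory-limit message:

```xml
<!-- yarn-site.xml (sketch; values illustrative for a small single node) -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>256</value>
</property>
<property>
  <!-- Containers are killed when virtual memory exceeds
       vmem-pmem-ratio times the allocation; disabling the
       check is a common workaround on small machines. -->
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
```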
Removed all settings from yarn-site.xml, then stopped and restarted using $HADOOP_PREFIX/sbin/stop-yarn.sh, stop-dfs.sh, and the corresponding start scripts, but the job again seemed to run forever.
Used the first set of instructions at https://coderwall.com/p/a5kbtw/installing-apache-hadoop-on-linux to enable passwordless ssh.
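The passwordless-ssh setup boils down to something like the following sketch (run as the user that starts Hadoop; skip the keygen step if a key already exists, since ssh-keygen will prompt before overwriting):

```shell
# Generate a passphrase-less RSA key pair (only if ~/.ssh/id_rsa doesn't exist).
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Authorize the key for logins to this same machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys

# Should now log in without a password prompt.
ssh localhost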
Restarted the Hadoop services; no password prompt this time, but the test still fails.
Modified the command, setting --num_containers 1 --master_memory 512, and it reports completed successfully.
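For the record, the test in question is the YARN distributed-shell example; with the reduced settings the invocation looked roughly like this (jar version and path depend on the local install, so treat it as a sketch):

```shell
# Sketch: YARN distributed-shell smoke test with one small container.
hadoop jar "$HADOOP_PREFIX"/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-*.jar \
  org.apache.hadoop.yarn.applications.distributedshell.Client \
  --jar "$HADOOP_PREFIX"/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-*.jar \
  --shell_command date \
  --num_containers 1 \
  --master_memory 512
```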