It's been a while since I imported very large datasets, but I'm fairly sure I succeeded in importing NA, and I did need to allocate a lot of memory, so that might be your problem. However, I don't think the Java heap matters as much as the additional non-heap memory you can allocate with the --max-memory argument. To see all options, try:
java -Xms1280m -Xmx1280m -cp "target/osm-0.2.2-neo4j-3.5.1.jar:target/dependency/*" org.neo4j.gis.osm.OSMImportTool -h
The options that are relevant to memory are:
--max-memory <max memory that importer can use>
(advanced) Maximum memory that importer can use for various data structures and
caching to improve performance. If left as unspecified (null) it is set to 90%
of (free memory on machine - max JVM memory). Values can be plain numbers, like
10000000 or e.g. 20G for 20 gigabyte, or even e.g. 70%.
--cache-on-heap Whether or not to allow allocating memory for the cache on heap
(advanced) Whether or not to allow allocating memory for the cache on heap. If
'false' then caches will still be allocated off-heap, but the additional free
memory inside the JVM will not be allocated for the caches. This to be able to
have better control over the heap memory. Default value: false
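To make the default behaviour concrete, here is a small sketch of how that "90% of (free memory - max JVM memory)" rule works out. The numbers are assumptions for illustration: 16 GB of free memory, and the 1280 MB heap from the -Xmx flag in the command above.

```shell
# Sketch of the importer's default --max-memory calculation:
# 90% of (free memory on the machine - max JVM heap).
# These values are illustrative assumptions, not read from a real machine.
free_mb=16384        # assumed free memory, in MB
jvm_max_mb=1280      # matches -Xmx1280m in the command above
max_memory_mb=$(( (free_mb - jvm_max_mb) * 90 / 100 ))
echo "default --max-memory would be about ${max_memory_mb}M"
```

In practice you can skip the calculation and pass the flag explicitly, e.g. `--max-memory=20G` or a percentage like `--max-memory=70%`, appended to the import command above.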
I would suggest trying a few options. I needed to increase the memory a lot to get some of the biggest datasets to load; for one dataset I even allocated extra swap space on my SSD to make it work.
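If you do go the swap route on Linux, the setup is along these lines. This is a system-configuration sketch, not something to run blindly: the /swapfile path and 8G size are assumptions, it needs root, and you should remove the file after the import.

```shell
# Sketch: temporary 8 GB swap file on an SSD (Linux). Path and size
# are assumptions; adjust to your machine. Run as root.
fallocate -l 8G /swapfile   # reserve the space (use dd if fallocate is unsupported)
chmod 600 /swapfile         # swap files must not be world-readable
mkswap /swapfile            # format the file as swap
swapon /swapfile            # enable it; verify with 'swapon --show'
# After the import finishes: swapoff /swapfile && rm /swapfile
```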