I'd love to hear how others are doing this. I've got several .cql scripts that I run regularly, and I've built some of my own tools to handle things like injecting credentials/tokens to keep them secure.
I'm largely just wrapping the scripts in PowerShell, scheduling them with Windows Task Scheduler, and logging results back into the graph under a node label: (:CypherLogEntry). (I know! It's certainly not very sophisticated.)
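To give a sense of the logging half of the pattern, here's a minimal Python sketch. The `build_log_cypher` helper and its property names are hypothetical stand-ins for what my wrapper does; in a real run you'd send this statement to Neo4j via cypher-shell or a driver, and pass the values as query parameters rather than interpolating them:

```python
from datetime import datetime, timezone

def build_log_cypher(script: str, status: str, started_at: str) -> str:
    # Hypothetical helper: builds a CREATE statement for a (:CypherLogEntry)
    # node recording which script ran, whether it succeeded, and when.
    # In practice, pass these values as query parameters instead of
    # interpolating them into the query string.
    return (
        "CREATE (:CypherLogEntry {"
        f"script: '{script}', status: '{status}', startedAt: '{started_at}'"
        "})"
    )

started = datetime.now(timezone.utc).isoformat()
print(build_log_cypher("daily_import.cql", "success", started))
```

Querying those log nodes afterwards is how I check which schedules ran and which failed.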
As the data I'm working with grows, it's getting harder to keep track of all my schedules (what's working, what's broken, etc.).
How are others managing imports/exports and their Neo4j/Cypher processes?
I've explored some of the third-party ETL/data management tools, but they often seem to add complexity rather than help me manage the ETL processes I've already designed.
Do I just need to buckle down and learn a tool like Pentaho Kettle?