James Montoya

Artificial Intelligence Developer | Workflow Automation & AI Integration using N8N and Amazon Bedrock

Bridging traditional AI solutions and data engineering with cutting-edge cloud AI technologies to transform how businesses make decisions. Specializing in serverless AI architectures and intelligent automation. Based in Sydney, Australia.

Skills

N8N, HubSpot, Cloud Integration, API Integrations, LLM, Agentic AI, LlamaIndex, Amazon Bedrock, AWS Lambda, Amazon S3, API Gateway, OpenSearch, Claude AI, Neo4j, Machine Learning, Python, SQL, ETL Processes, SQL Server, Oracle, Tableau, Power BI, Excel VBA Macros, JS, React.js, Node.js, Docker, PostgreSQL, Communication, Teamwork, Problem Solving, Adaptability, Critical Thinking, Creativity, Leadership, Mentorship, Teaching, Apache Airflow, Vector Databases, ArangoDB, Graph Database, AQL, LangChain, MLOps, BigQuery, CloudWatch, Serverless Architecture, RAG Systems

AI Engineer & Automation Specialist

Sydney-based AI engineer with 15+ years of experience spanning enterprise data engineering, workflow automation, and production AI systems.

I build intelligent systems that turn unstructured data into business value. My current work focuses on designing and deploying LLM-powered pipelines, multi-agent architectures, and RAG (Retrieval-Augmented Generation) systems for Australian businesses — from recruitment and financial services to the not-for-profit sector.

What I'm Building Right Now

For a Defence & Government recruitment firm, I designed and built an end-to-end AI recruitment pipeline: CV ingestion, structured data extraction (skills, security clearances, capability domains), graph-based storage in ArangoDB, and a conversational AI agent accessible through Microsoft Teams. The system uses a multi-agent architecture — router, query builder, response formatter, and validator — orchestrated through n8n, enabling recruiters to find candidates through natural language queries instead of manual database searches.
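The router-first flow described above can be sketched in plain Python. This is a minimal illustration, not the production n8n workflow: the routing keywords, agent names, and return shapes are all assumptions made for the example.

```python
# Minimal sketch of the router stage in a multi-agent recruitment pipeline.
# Routing rules and field names here are illustrative assumptions only.

def route(query: str) -> str:
    """Classify a recruiter query so downstream agents know how to handle it."""
    q = query.lower()
    if any(k in q for k in ("clearance", "nv1", "nv2", "baseline")):
        return "graph_query"      # structured facts -> AQL query builder
    if any(k in q for k in ("similar to", "like", "experience with")):
        return "semantic_search"  # fuzzy matching -> vector retrieval
    return "clarify"              # ask the recruiter a follow-up question

def run_pipeline(query: str) -> dict:
    """Router -> query builder -> response formatter -> validator, in miniature."""
    intent = route(query)
    draft = {"intent": intent, "query": query}
    # validator stage: reject drafts missing required fields before replying in Teams
    assert {"intent", "query"} <= draft.keys()
    return draft
```

In the real system each stage is a separate agent node; the point of the router is that a cheap classification step decides whether a query needs an exact graph lookup or a fuzzy semantic search before any expensive work happens.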

For eThink Solutions — an award-winning Australian custom application and BI company whose clients include Lyft, GrainCorp, and DP World — I'm building a RAG system that enhances their mortgage broker CRM with document intelligence. The platform processes 150+ loan documents with 95%+ accuracy, classifies and chunks content at granular levels, maps relationships between banks, processes, and compliance requirements through a graph model, and lets staff query everything through natural conversation. Manual document lookup time has been reduced by 80%.
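The granular chunking step can be illustrated with a small sketch. It assumes paragraph-delimited text, and the metadata fields (`bank`, `doc_type`) are hypothetical stand-ins for whatever keys the graph model actually links on.

```python
# Illustrative chunker: split a loan document into paragraph-level chunks
# tagged with metadata the graph model can link on. Field names are assumptions.

def chunk_document(text: str, bank: str, doc_type: str, max_chars: int = 500):
    """Group paragraphs into chunks of roughly max_chars, keeping order."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if len(current) + len(para) > max_chars and current:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    # each chunk carries the metadata needed to become a node/edge in the graph
    return [{"text": c, "bank": bank, "doc_type": doc_type, "seq": i}
            for i, c in enumerate(chunks)]
```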

For not-for-profit organisations, I develop and deliver AI training programs that make AI accessible to non-technical teams, covering safe adoption frameworks, data governance, and practical tool implementation.

Core Technical Stack

  • LLM Integration & AI: Claude (Anthropic API), OpenAI, Gemini, Grok, Ollama, Amazon Bedrock models, prompt engineering, multi-agent orchestration, RAG systems
  • Workflow Automation: n8n (self-hosted, production-grade), end-to-end pipeline design and deployment
  • Data & Databases: ArangoDB (multi-model: document, graph, vector search), PostgreSQL, advanced SQL, ETL/data pipeline architecture
  • Cloud & Infrastructure: AWS (EC2, S3, Bedrock), Docker, Traefik, serverless architectures
  • Integrations: HubSpot, Salesforce, Azure, Google APIs, any system with API connectivity
  • Frameworks & Tools: n8n, CrewAI, LangChain, Python, TypeScript

Background

My foundation is over twelve years of enterprise data engineering across financial services and telecommunications — building ETL pipelines, designing data models, leading database migrations, and developing real-time KPI dashboards at scale. That experience gives me something many AI engineers lack: a deep understanding that even the most sophisticated models are only as good as the data architecture underneath them.

I see AI agents not as black boxes but as data systems requiring careful orchestration — ingestion, transformation, storage, retrieval, and presentation. My ability to bridge traditional data infrastructure with cutting-edge AI implementation means I deliver production-ready systems, not proof-of-concept demos.

Let's Build Something Remarkable

I'm focused on helping forward-thinking teams deploy intelligent systems that deliver measurable business outcomes. Whether it's automating complex workflows, building conversational AI interfaces, or designing the data architecture that makes it all work — I bring the engineering depth to take projects from concept to production.

Get in touch: james@ethicalaispecialists.com.au | ethicalaispecialists.com.au

Featured Projects

A showcase of my latest projects, demonstrating my skills in AI development, RAG systems, data engineering, and machine learning applications.

Voice of the Customer (VoC) — AI-Powered Call Transcript Analytics Platform

Built an end-to-end Voice of the Customer analytics pipeline that processes real insurance call centre transcripts through LLM extraction, stores structured insights in ArangoDB, and serves a live dashboard via webhook. Extracts 16 structured fields per call, including calibrated sentiment scoring, sentiment journey tracking, churn risk assessment, topic classification, and agent performance.
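A sketch of the validation that sits between LLM extraction and the database write. The field names and the -1..1 sentiment range are assumptions for illustration, not the production schema of all 16 fields.

```python
import json

# Illustrative subset of the extracted fields; names and ranges are assumptions.
REQUIRED_FIELDS = {"sentiment_score", "churn_risk", "topic", "agent_score"}

def validate_extraction(llm_json: str) -> dict:
    """Check an LLM extraction before it is written to the database."""
    record = json.loads(llm_json)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"LLM output missing fields: {sorted(missing)}")
    # clamp sentiment to a fixed -1..1 range so dashboard scores stay comparable
    record["sentiment_score"] = max(-1.0, min(1.0, float(record["sentiment_score"])))
    return record
```

Validating and calibrating model output before storage is what keeps a live dashboard trustworthy when the upstream LLM occasionally returns malformed or out-of-range values.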

N8N Gemini 2.5 Flash ArangoDB Docker Traefik Chart.js Webhooks Prompt Engineering Python JavaScript

Enterprise RAG Document Chatbot — Intelligent Knowledge Retrieval System

Designing and building a production-grade RAG chatbot for the Australian mortgage industry, enabling brokers to query a large library of operational documents — checklists, bank-specific guides, and settlement procedures — using natural language. Built on Amazon Bedrock with N8N workflow orchestration, ArangoDB graph database, and AI-powered document processing. Designed for source-attributed, grounded responses with full document traceability and hallucination prevention.

AWS Bedrock ArangoDB N8N Graph Database Claude AI Titan Embeddings RAG Docker JavaScript Python

AI-Powered Candidate Matching & Retrieval System Using N8N and ArangoDB via a Microsoft Teams Avatar

End-to-end intelligent recruitment candidate matching system combining graph database technology with AI-powered CV extraction and dual-retrieval search. Processes CVs through a multi-stage pipeline, stores candidates as a graph, and enables recruiters to find qualified candidates using both AQL graph queries and semantic vector search simultaneously through Microsoft Teams Avatar integration.

N8N ArangoDB AWS EC2 Graph Database AQL GPT-4.1 Mini Vector Embeddings Semantic Search Microsoft Teams Nginx JavaScript
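Combining the two retrieval paths can be sketched as a simple score merge. The weighting scheme and candidate IDs below are illustrative assumptions, not the production ranking logic.

```python
# Sketch of merging dual-retrieval results: exact AQL graph hits plus
# semantic vector hits. Weights here are illustrative assumptions.

def merge_results(graph_hits: list[str],
                  vector_hits: list[tuple[str, float]],
                  top_k: int = 5) -> list[str]:
    """Boost candidates found by BOTH retrieval paths, then rank by score."""
    scores: dict[str, float] = {}
    for cid in graph_hits:
        scores[cid] = scores.get(cid, 0.0) + 1.0   # exact structured match
    for cid, sim in vector_hits:
        scores[cid] = scores.get(cid, 0.0) + sim   # semantic similarity 0..1
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

The design intuition: a candidate who satisfies the hard constraints (clearance, domain) and also scores well semantically should outrank one found by only a single path.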
AI-Powered Mortgage Broker Assistant for eThink Solutions

Nov 2025

Developed an intelligent chatbot system using Amazon Bedrock to automate mortgage broker workflows. The system processes 150+ documents and delivers accurate, contextual responses with 95%+ accuracy, reducing manual lookup time by 80%. Features multi-modal query classification and vector-based knowledge retrieval.

Amazon Bedrock Claude 3 Haiku AWS Lambda OpenSearch Python Vector Search n8n API Gateway

Confident with AI: A Practical Guide for Not-for-Profits

Created a comprehensive AI training video course for not-for-profit organisations: self-paced online training that empowers nonprofit professionals to implement AI confidently and strategically through practical frameworks, hands-on exercises, and ethical AI principles. Features 3 sections with 21 lessons covering strategic AI partnership and communication.

AI Training Course Development Video Production AI Strategy Nonprofit Sector Ethical AI

Production-Ready RAG (Retrieval-Augmented Generation) System

Built an intelligent chatbot system that combines document retrieval with AI-powered responses for enhanced user experience. Features real-time semantic document search using vector embeddings and automated document processing pipeline.

N8N PostgreSQL pgvector Next.js 15 React TypeScript Tailwind CSS OpenAI Anthropic Claude Docker Redis Vector Search
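The semantic search at the heart of this system reduces to ranking chunks by embedding similarity. This toy version uses tiny hand-made vectors in place of real pgvector/OpenAI embeddings, purely to show the shape of the computation.

```python
import math

# Toy semantic search over precomputed embeddings. The 2-d vectors stand in
# for real high-dimensional embeddings stored in pgvector.

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec: list[float], docs: dict[str, list[float]], top_k: int = 2):
    """Rank document chunks by similarity to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:top_k]
```

In production, pgvector performs this ranking inside PostgreSQL with an index, so the application only sends the query embedding and receives the top-k chunk IDs.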
Automated Resume Processing & CRM Integration System Using N8N Workflows

Jul 2025

Designed and implemented an end-to-end automated resume processing pipeline that transforms unstructured documents into structured candidate profiles within a leading CRM system, reducing manual data entry time by 95% for a recruitment consultancy.

N8N, LLM Models, LlamaIndex Parsing, CRM Integration, HubSpot, Cloud Services (Google Drive, Docs, Sheets, Gmail, Calendar; Microsoft 365, Outlook, OneDrive, Calendar)

Automated Deal Processing & CRM Pipeline Using N8N Workflows

Engineered an intelligent email processing system that automatically detects, extracts, and processes emails with attached documents, transforming unstructured tender information into structured CRM deals with 90%+ accuracy and eliminating manual tender tracking overhead for a consulting firm.

N8N, LLM Models, LlamaIndex Parsing, Outlook Integration, Document Parsing APIs, CRM Integration, HubSpot, Cloud Services (Google Drive, Docs, Sheets, Gmail, Calendar; Microsoft 365, Outlook, OneDrive, Calendar)
Comprehensive Cleaning Company Management System Using N8N Workflows

May 2025

I developed a multi-agent Cleaning Company Management System that automates operations, optimizes resource allocation, and provides actionable insights for cleaning service providers. This project demonstrates my expertise in database design, workflow automation, and AI integration.

PostgreSQL N8N OpenAI GPT Docker
Jobseeker-Agent-system

Jan 2025

A multi-agent system using LangChain and CrewAI that helps job seekers prepare application documents. The system analyzes job postings, compares them with a candidate's resume, and generates tailored documents such as cover letters and optimized resumes.

Python LangChain CrewAI OpenAI LLM React
ML-Powered Donor Analysis

2023-2024

A comprehensive system for nonprofit organizations to analyze donor behavior, predict future contributions, and optimize fundraising strategies. Implemented using containerized machine learning environments for scalability.

Docker Python Apache Airflow Scikit-Learn KMeans

My Professional Journey

Artificial Intelligence Developer | Workflow Automation & AI Integration

Nov 2024 - Present

Developing enterprise AI solutions using N8N and Amazon Bedrock. Delivered AI-powered mortgage broker assistant for eThink Solutions' client using AWS serverless architecture, processing 150+ documents with 95%+ accuracy. Built an AI-powered candidate matching system using ArangoDB graph database with dual-retrieval search (AQL + semantic vector) for the Australian Defence and Government recruitment sector. Creating intelligent automation systems including email parsing, CRM integrations, and task automation workflows.

Data Analyst

Nov 2023 - Nov 2024

Developed Docker-based ML environments for model development. Built predictive models for donor behavior analysis and implemented automated ETL processes using Apache Airflow.

Master's in Software Engineering (AI)

2021 - 2024

Completed Master's in Software Engineering with specialization in AI at Torrens University Australia, developing advanced skills in machine learning, artificial intelligence, and software development.

Move to Australia

2018

Made the bold decision to relocate to Australia to pursue educational opportunities, learn English immersively, and explore new professional horizons in the field of artificial intelligence.

Data Engineering at Tigo

2014 - 2017

I developed a mobile application for KPI dashboards using Apache Cordova (PhoneGap), enabling real-time data visualization and improved decision-making. I also created ETL packages in SSIS to facilitate seamless data integration across multiple sources, automated reporting systems in SQL Server to reduce manual effort and improve data accuracy, and implemented data scraping and processing solutions in Python, streamlining data collection and analysis for actionable insights.

Analyst Developer at Grupo Bancolombia

2006 - 2014

I led the treasury database reporting migration project, ensuring a smooth transition and improved data accessibility. As part of this initiative, I implemented reporting solutions using Datamart and SAP Business Objects to enhance data analysis and visualization. Additionally, I managed the Murex system transition, optimizing financial data processes and system integration. I also created a VBA macro to automate the migration of information from contingency spreadsheets to the production environment, improving efficiency and reducing manual errors. My work involved leveraging Oracle Database and Business Objects to support accurate and efficient reporting for key stakeholders.

Specialization in Software Development

2013

Completed a Specialization in Software Development at Universidad de Medellín, Colombia, gaining expertise in the software development lifecycle, Scrum methodologies, requirements elicitation, and other key aspects of software engineering.

Bachelor of Computer Science

2009

Completed a Bachelor of Computer Science at Politécnico Jaime Isaza Cadavid, Medellín, Colombia, developing my skills in coding, databases, and software development.

Get In Touch

Feel free to reach out for collaborations or just a friendly hello

jamonhin@gmail.com