Bridging traditional data engineering with cutting-edge cloud AI technologies to transform how businesses make decisions. Specializing in serverless AI architectures and intelligent automation. Based in Sydney, Australia.
Sydney-based AI engineer with 15+ years of experience spanning enterprise data engineering, workflow automation, and production AI systems.
I build intelligent systems that turn unstructured data into business value. My current work focuses on designing and deploying LLM-powered pipelines, multi-agent architectures, and RAG (Retrieval-Augmented Generation) systems for Australian businesses — from recruitment and financial services to the not-for-profit sector.
For a Defence & Government recruitment firm, I designed and built an end-to-end AI recruitment pipeline: CV ingestion, structured data extraction (skills, security clearances, capability domains), graph-based storage in ArangoDB, and a conversational AI agent accessible through Microsoft Teams. The system uses a multi-agent architecture — router, query builder, response formatter, and validator — orchestrated through n8n, enabling recruiters to find candidates through natural language queries instead of manual database searches.
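The multi-agent flow above can be sketched in miniature. This is an illustrative sketch only: the production system runs these stages as LLM-backed agents orchestrated in n8n, whereas here each agent is a plain function (and the routing/validation logic is a stand-in, not the real prompts or schema) so the router → query builder → response formatter → validator control flow is visible.

```python
from dataclasses import dataclass

@dataclass
class AgentResult:
    intent: str
    aql: str
    response: str
    valid: bool

def route(query: str) -> str:
    # Router agent: classify the recruiter's natural-language request.
    # (Keyword matching stands in for an LLM classification call.)
    if "clearance" in query.lower():
        return "search_by_clearance"
    return "search_by_skill"

def build_query(intent: str) -> str:
    # Query-builder agent: translate the intent into an AQL query
    # (illustrative queries; not the production graph schema).
    if intent == "search_by_clearance":
        return "FOR c IN candidates FILTER c.clearance == @level RETURN c"
    return "FOR c IN candidates FILTER @skill IN c.skills RETURN c"

def format_response(summary: str) -> str:
    # Response-formatter agent: wrap raw results for Microsoft Teams.
    return f"Found candidates: {summary}"

def validate(response: str) -> bool:
    # Validator agent: reject malformed answers before they reach Teams.
    return response.startswith("Found candidates:")

def handle(query: str) -> AgentResult:
    intent = route(query)
    aql = build_query(intent)
    response = format_response(f"via {intent}")
    return AgentResult(intent, aql, response, validate(response))

result = handle("Who holds an NV1 clearance?")
print(result.intent)  # search_by_clearance
```

Separating the stages this way means each agent can be prompted, tested, and replaced independently, which is what makes the n8n orchestration tractable.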
For eThink Solutions — an award-winning Australian custom application and BI company whose clients include Lyft, GrainCorp, and DP World — I'm building a RAG system that enhances their mortgage broker CRM with document intelligence. The platform processes 150+ loan documents with 95%+ accuracy, classifies and chunks content at granular levels, maps relationships between banks, processes, and compliance requirements through a graph model, and lets staff query everything through natural conversation. Manual document lookup time has been reduced by 80%.
For not-for-profit organisations, I develop and deliver AI training programs that make AI accessible to non-technical teams, covering safe adoption frameworks, data governance, and practical tool implementation.
My foundation is over twelve years of enterprise data engineering across financial services and telecommunications — building ETL pipelines, designing data models, leading database migrations, and developing real-time KPI dashboards at scale. That experience gives me something many AI engineers lack: a deep understanding that even the most sophisticated models are only as good as the data architecture underneath them.
I see AI agents not as black boxes but as data systems requiring careful orchestration — ingestion, transformation, storage, retrieval, and presentation. My ability to bridge traditional data infrastructure with cutting-edge AI implementation means I deliver production-ready systems, not proof-of-concept demos.
I'm focused on helping forward-thinking teams deploy intelligent systems that deliver measurable business outcomes. Whether it's automating complex workflows, building conversational AI interfaces, or designing the data architecture that makes it all work — I bring the engineering depth to take projects from concept to production.
Get in touch: james@ethicalaispecialists.com.au | ethicalaispecialists.com.au
A showcase of my latest projects, demonstrating my skills in AI development, RAG systems, data engineering, and machine learning applications.
Built an end-to-end Voice of the Customer analytics pipeline that processes real insurance call centre transcripts through LLM extraction, stores structured insights in ArangoDB, and serves a live dashboard via webhook. Extracts 16 structured fields per call including calibrated sentiment scoring, sentiment journey tracking, churn risk assessment, topic classification, and agent performance.
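The extraction step can be sketched as follows. This is a hedged illustration: in the real pipeline an LLM emits a JSON document per call transcript (16 fields), which is then written to ArangoDB; here the LLM call is stubbed and only a handful of assumed field names are shown, so the shape of a validated structured record is clear.

```python
import json

# Assumed subset of the per-call schema (illustrative names, not the
# production field list).
CALL_SCHEMA = {
    "sentiment_score": float,    # calibrated score, e.g. -1.0 .. 1.0
    "sentiment_journey": list,   # per-segment scores across the call
    "churn_risk": str,           # e.g. "low" | "medium" | "high"
    "topics": list,
    "agent_performance": float,
}

def extract_insights(transcript: str) -> dict:
    # Stand-in for the LLM extraction call; a real implementation would
    # prompt the model to return JSON matching CALL_SCHEMA.
    llm_output = json.dumps({
        "sentiment_score": -0.4,
        "sentiment_journey": [-0.8, -0.5, 0.1],
        "churn_risk": "high",
        "topics": ["premium increase", "retention offer"],
        "agent_performance": 0.7,
    })
    record = json.loads(llm_output)
    # Type-check every field before the record is stored, so malformed
    # LLM output never reaches the database or the dashboard.
    for field, expected in CALL_SCHEMA.items():
        assert isinstance(record[field], expected), field
    return record

insights = extract_insights("Customer: my premium went up again...")
print(insights["churn_risk"])  # high
```

Validating against an explicit schema at the pipeline boundary is what keeps downstream dashboard queries reliable even when the model occasionally misbehaves.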
Designing and building a production-grade RAG chatbot for the Australian mortgage industry, enabling brokers to query a large library of operational documents — checklists, bank-specific guides, and settlement procedures — using natural language. Built on AWS Bedrock with n8n workflow orchestration, an ArangoDB graph database, and AI-powered document processing. Designed for source-attributed, grounded responses with full document traceability to prevent hallucinations.
End-to-end intelligent recruitment candidate matching system combining graph database technology with AI-powered CV extraction and dual-retrieval search. Processes CVs through a multi-stage pipeline, stores candidates as a graph, and enables recruiters to find qualified candidates using both AQL graph queries and semantic vector search simultaneously through Microsoft Teams Avatar integration.
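The dual-retrieval idea can be sketched like this: run a structured graph filter and a semantic vector search in parallel, then merge the candidate sets. A hedged sketch only — the real system issues AQL against ArangoDB and uses learned CV embeddings, whereas here both sides are stubbed with tiny in-memory data so the merge logic is visible.

```python
from math import sqrt

# Toy candidate store standing in for the ArangoDB graph + embedding index.
CANDIDATES = {
    "c1": {"skills": ["python", "etl"], "embedding": [0.9, 0.1]},
    "c2": {"skills": ["java"],          "embedding": [0.2, 0.9]},
    "c3": {"skills": ["python"],        "embedding": [0.8, 0.3]},
}

def graph_search(skill: str) -> set:
    # Stand-in for an AQL traversal: exact filter on structured attributes.
    return {k for k, v in CANDIDATES.items() if skill in v["skills"]}

def cosine(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def vector_search(query_vec, k: int = 2) -> set:
    # Stand-in for semantic search over CV embeddings.
    ranked = sorted(
        CANDIDATES,
        key=lambda c: cosine(query_vec, CANDIDATES[c]["embedding"]),
        reverse=True,
    )
    return set(ranked[:k])

def dual_retrieve(skill: str, query_vec) -> set:
    # Union the two result sets: exact graph matches the vector side
    # missed are kept, and vice versa.
    return graph_search(skill) | vector_search(query_vec)

print(sorted(dual_retrieve("python", [1.0, 0.0])))  # ['c1', 'c3']
```

The union is deliberately forgiving: graph queries catch precise, hard constraints (clearances, mandatory skills) while vector search surfaces near-miss candidates whose CVs describe the skill in different words.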
Developed an intelligent chatbot system using Amazon Bedrock to automate mortgage broker workflows. The system processes 150+ documents and delivers accurate, contextual responses with 95%+ accuracy, reducing manual lookup time by 80%. Features multi-modal query classification and vector-based knowledge retrieval.
Created a comprehensive AI training video course for not-for-profit organisations: self-paced online training that empowers nonprofit professionals to implement AI confidently and strategically through practical frameworks, hands-on exercises, and ethical AI principles. Features three sections with 21 lessons covering strategic AI partnership and communication.
Built an intelligent chatbot system that combines document retrieval with AI-powered responses for enhanced user experience. Features real-time semantic document search using vector embeddings and automated document processing pipeline.
Designed and implemented an end-to-end automated resume processing pipeline that transforms unstructured documents into structured candidate profiles within a leading CRM system, reducing manual data entry time by 95% for a recruitment consultancy.
Engineered an intelligent email processing system that automatically detects, extracts, and processes emails with document attachments, transforming unstructured tender information into structured CRM deals with 90%+ accuracy and eliminating manual tender-tracking overhead for a consulting firm.
I developed a Cleaning Company Management Multi-Agent System that automates operations, optimizes resource allocation, and provides actionable insights for cleaning service providers. This project demonstrates my expertise in database design, workflow automation, and AI integration.
A multi-agent system built with LangChain and CrewAI to help job seekers prepare application documents. The system analyzes job postings, compares them with your resume, and generates tailored documents such as cover letters and optimized resumes.
A comprehensive system for a nonprofit organisation to analyze donor behavior, predict future contributions, and optimize fundraising strategies, implemented using containerized machine learning environments for scalability.
Developing enterprise AI solutions using n8n and Amazon Bedrock. Delivered an AI-powered mortgage broker assistant for an eThink Solutions client using AWS serverless architecture, processing 150+ documents with 95%+ accuracy. Built an AI-powered candidate matching system using an ArangoDB graph database with dual-retrieval search (AQL + semantic vector) for the Australian Defence and Government recruitment sector. Creating intelligent automation systems including email parsing, CRM integrations, and task automation workflows.
Developed Docker-based ML environments for model development. Built predictive models for donor behavior analysis and implemented automated ETL processes using Apache Airflow.
Completed Master's in Software Engineering with specialization in AI at Torrens University Australia, developing advanced skills in machine learning, artificial intelligence, and software development.
Made the bold decision to relocate to Australia to pursue educational opportunities, learn English immersively, and explore new professional horizons in the field of artificial intelligence.
I developed a mobile application for KPI dashboards using Apache Cordova (PhoneGap), enabling real-time data visualization and improved decision-making. I also created ETL packages in SSIS to facilitate seamless data integration across multiple sources, automated reporting systems in SQL Server to reduce manual effort and improve data accuracy, and implemented data scraping and processing solutions in Python, streamlining data collection and analysis for actionable insights.
I led the treasury database reporting migration, ensuring a smooth transition and improved data accessibility. As part of this initiative, I implemented reporting solutions using a data mart and SAP BusinessObjects to enhance data analysis and visualization, and managed the Murex system transition, optimizing financial data processes and system integration. I also created a VBA macro to automate the migration of information from contingency spreadsheets to the production environment, improving efficiency and reducing manual errors. This work leveraged Oracle Database and BusinessObjects to support accurate, efficient reporting for key stakeholders.
Completed a Specialization in Software Development at Universidad de Medellín, Colombia, gaining expertise in the software development lifecycle, Scrum methodologies, requirements elicitation, and other key aspects of software engineering.
Completed a Bachelor of Computer Science at Politécnico Jaime Isaza Cadavid, Medellín, Colombia, developing my skills in coding, databases, and software development.
Feel free to reach out for collaborations or just a friendly hello
jamonhin@gmail.com