The right AI consulting firm for your business is one where the person who presents to you is also the person building your solution. Before signing with any AI consulting firm, there are seven questions worth asking: who actually builds the work, what production systems they can show you, how they scope and price projects, what post-launch support looks like, how they handle Australian data privacy requirements, whether they have relevant industry experience, and what the commercial relationship looks like beyond the first engagement. These questions cut through the sales conversation quickly and separate firms that build and deploy production AI systems from firms that sell strategy and outsource execution.
The AI consulting market in Australia has expanded fast, and with it the range of what the phrase covers. Some firms deliver strategy and roadmaps: useful if you need executive alignment and a documented plan, but they don't write code and won't ship anything themselves. Others are digital agencies that have added "AI" to a service list that already included web development and digital marketing, without the engineering depth to back it up. Still others are genuine implementation shops where experienced engineers design, build, and deploy production systems. The sales process looks the same for all three. The right questions reveal the difference.
Question 1: Who Actually Builds the Solution?
This is the most important question and the one most firms are least prepared for. Ask directly: is the person you're currently talking to also the person who will write the code? Or, once you sign, does the work get handed to a delivery team, a pool of contractors, or an offshore resource?
The quality of an AI implementation is tied closely to the experience of the person doing the build. A senior engineer who has shipped production AI systems understands the failure modes from having encountered them, the compliance requirements from having had to satisfy them, and the integration challenges from having solved them for real clients. A junior team working from a brief produces a different result, and not in a good way.
Ask to meet whoever will be building your solution before you commit. If the answer is vague ("our delivery team will handle execution"), if the firm can't name a specific person, or if the conversation suddenly shifts to process and methodology rather than individual expertise, you're talking to a firm organised around sales volume. That's not automatically disqualifying, but you need to understand it upfront and weight it accordingly when comparing options.
Question 2: Can I See Production Systems You've Actually Shipped?
Portfolio decks, case study PDFs, and demo environments are easy to produce and tell you very little about implementation quality. Ask instead to see live systems currently in production use by real businesses. Ask what technology stack they used, what the integrations were, and what problems came up during the build.
The way firms answer this question is consistently informative. Implementation teams with genuine depth talk in specifics: the integration that needed a custom connector because the vendor's API didn't support the required authentication pattern, the data quality problem that required pre-processing before the model could be trained reliably, the production incident in week two and exactly how it was diagnosed and resolved. Firms with thin experience give you generic descriptions of "AI-powered workflows" and "intelligent automation" without substance.
If a firm can't point to specific live systems, the follow-up question is whether they have client references who can speak to the build quality and post-launch reliability of what was delivered. A reference call with a past client is more valuable than any portfolio document.
Question 3: How Do You Scope a Project and What Does Pricing Look Like?
The right answer is: a fixed-scope proposal priced per project, developed after a structured discovery phase. Hourly billing misaligns incentives in consulting engagements of any kind: the firm earns more revenue if the project takes longer. Fixed-scope proposals require the firm to understand your problem well enough to commit to a budget and a timeline. That discipline, doing the hard work of scoping before pricing, is a reliable proxy for how rigorous the firm's engineering process actually is.
Discovery phases are normal and appropriate for complex AI projects. A one- or two-week discovery phase to map data flows, understand integration points, identify compliance requirements, and define measurable success criteria is good practice. What you want to avoid is a discovery phase that produces another paid discovery phase rather than a fixed-scope build proposal.
When you receive a proposal, look at how specific the scope is. Vague deliverables ("an AI-powered document processing system") with a big number attached are a different proposition to specific deliverables ("automated processing of incoming referral letters with extraction of patient identifiers, document classification, and write-back to your practice management system via its REST API, with human review queues for extractions below 85% confidence"). Specificity in the scope statement reflects clarity of thinking about what's actually being built.
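To make the review-queue part of that example concrete: the "85% confidence" clause corresponds to a small piece of routing logic in the delivered system. The sketch below is purely illustrative (the `Extraction` type, field names, and threshold are hypothetical, not drawn from any particular vendor's system), but it shows the kind of specific, testable behaviour a well-scoped deliverable implies.

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.85  # extractions below this go to a human reviewer


@dataclass
class Extraction:
    field: str         # e.g. "patient_id" (illustrative field name)
    value: str
    confidence: float  # model's confidence score, 0.0 to 1.0


def route(extractions):
    """Split extractions into auto-accepted results and a human review queue."""
    accepted, review_queue = [], []
    for e in extractions:
        if e.confidence >= REVIEW_THRESHOLD:
            accepted.append(e)
        else:
            review_queue.append(e)
    return accepted, review_queue


# Example: one confident extraction, one that needs a human to check it
accepted, queue = route([
    Extraction("patient_id", "P-10442", 0.97),
    Extraction("referring_doctor", "Dr Smth", 0.61),
])
print([e.field for e in accepted])  # ['patient_id']
print([e.field for e in queue])     # ['referring_doctor']
```

A vague scope statement leaves this threshold, and who handles the low-confidence cases, entirely undefined; a specific one lets you test the delivered system against exactly this behaviour.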
Question 4: What Happens When Something Goes Wrong After Launch?
Software deployed to production develops problems. This is true of all software, and AI systems in production are no exception. The quality question isn't whether problems will occur: it's how quickly they'll be caught, how clearly they'll be communicated, and how rapidly they'll be resolved.
Ask specifically about the warranty period, what's covered during that period, and what the response time commitment is for production issues. A firm that builds carefully will stand behind its work with specific commitments: a defined warranty period covering bugs in the delivered scope, a response time SLA for production incidents (hours, not days), and a clear process for distinguishing a defect in the delivered scope from a new change request.
Vague reassurances like "we're always available to help" without specifics mean either the firm hasn't thought this through or has thought it through and decided not to commit. Both interpretations are informative. Post-launch support is where the difference between a well-built system and a rushed one becomes obvious, and it's where good firms consistently differentiate themselves from average ones.
Question 5: How Do You Handle Data Privacy and Australian Compliance Requirements?
Any system handling customer data, financial information, health records, or other regulated data needs to be built around Australian data residency and privacy requirements from the architecture stage. Retrofitting compliance controls into a system that wasn't designed with them is expensive and typically incomplete.
Ask specifically: where will our data live, who will have access to it, how is it encrypted in transit and at rest, and what is the process if there's a breach? The answers should be concrete. Your data should be stored in Australia using domestic cloud infrastructure (AWS Sydney, Azure Australia East, GCP Sydney, or equivalent Australian-hosted providers). Access should be restricted to authorised users with audit logging in place. Encryption should be standard. Breach notification should follow the Notifiable Data Breaches scheme under the Privacy Act 1988.
Firms that have built in regulated sectors (healthcare, financial services, professional services handling client data) have well-formed answers to these questions. Firms without that experience give you abstract assurances. If your business operates in a regulated sector, this question is particularly important: sector-specific requirements like APRA's CPS 234 for financial services, state and territory health records legislation for healthcare providers, or state-level privacy obligations for certain government-adjacent services need to be understood before a line of code is written.
Question 6: Do You Have Experience in My Industry or With My Problem Type?
Industry experience compresses project timelines and reduces risk in specific, practical ways. A firm that has built workflow automation for healthcare providers already understands how to handle health data compliantly, which practice management systems have workable APIs, what clinical and admin staff need from an intake workflow, and what the common failure modes are. They've already made the mistakes and corrected them on someone else's engagement and budget.
That said, industry experience isn't always available and isn't always essential. A technically strong generalist firm can deliver good outcomes in industries they haven't worked in before, provided they take the discovery process seriously and build compliance requirements into the architecture from the start. The key question, if they don't have direct industry experience, is how they'll develop sufficient domain knowledge to scope and build accurately. Asking to see examples from adjacent industries and speaking to references from those engagements gives you a reasonable signal.
Question 7: What Does the Commercial Relationship Look Like Beyond the First Project?
A well-built AI system should be maintainable by your team with reasonable documentation and support, and should not create perpetual dependency on the consulting firm to function. Ask: after delivery, will your internal team be able to make configuration changes and handle routine issues, or will you need to call the consultant for every modification?
Ask about documentation practices, how operational knowledge is transferred during handover, and whether ongoing support retainers are available and at what cost and scope. The right firm builds for handover. They want to do good work that earns repeat business on new projects, not create a situation where you're locked into paying ongoing fees to keep an existing system operational. Firms that resist giving you clear ownership of the code, documentation, and operational knowledge are worth being cautious about.
What the Answers Tell You
These questions don't require technical expertise to ask or to evaluate. They're operational and commercial questions any business leader can assess. Firms that give you direct, specific answers are worth progressing with. Firms that deflect, go vague, or redirect to their methodology rather than their actual delivery history are telling you something important.
In my experience, the most reliable signal is specificity. A firm that can talk about specific production systems, specific technical challenges, and specific post-launch incidents has the engineering depth that comes from actually building things. A firm that talks primarily about its process and methodology, without being able to ground it in concrete examples of work delivered, probably hasn't built as much as the sales conversation implies.
If you're evaluating AI consulting options and want to understand how ForgeIT approaches engagements, the services page covers the types of projects we take on and the way we work. Discovery calls are free and carry no commitment.