Single conversational support channel
Replaces fragmented phone-based inquiries across multiple departments with one intelligent AWS-powered conversational entry point
SoftwareOne case study

How AWS-based agentic AI unifies fragmented member support into a single intelligent conversational experience
A national business association in the APAC region is modernizing how it serves tens of thousands of member companies by implementing an AWS-based agentic AI assistant, designed and delivered by SoftwareOne (formerly Crayon). The solution demonstrates how Amazon Web Services (AWS) agentic and generative AI services can safely front a complex public-facing inquiry process, while laying a scalable foundation for future automation and agentic transformation.
Replaced fragmented phone-based inquiries across multiple departments with one intelligent AWS-powered conversational entry point
Implemented a governed, extensible architecture using Amazon Bedrock, Bedrock Flows, and Amazon Lex to support future AI expansion
Enabled users to receive curated, context-aware answers while automatically routing unresolved or complex cases to the appropriate support teams
The customer is the preeminent body representing business in a major economy, championing a broad membership of companies above a defined annual revenue threshold. Its remit spans policy support, trade facilitation, and practical programs such as upskilling and workforce development, delivered through multiple specialized business units.
Each unit maintained its own documentation and processes, from trade agreements to training subsidies, resulting in an extensive, highly distributed knowledge base. Inquiries from members and the public were handled almost entirely by phone, with separate contact numbers for different floors and departments, and a small frontline team fielding questions ranging from basic procedural queries to detailed regulatory issues. This model created bottlenecks, increased the risk of mis‑routing calls, and made it difficult to provide consistently accurate, up‑to‑date information at scale.
The organization recognized that modernizing member support required a solution capable of navigating this complexity. The rapidly emerging fields of generative AI (GenAI) and AI agents offered the ideal tools to bridge the gap between fragmented data and a seamless user experience.
At the point of engagement, the organization had no production chatbot in place. Previous attempts to experiment with alternative cloud tooling for conversational interfaces had proved difficult to scale, particularly around constraining responses to trusted content and handling more complex, multi-step interactions. The customer therefore sought a partner that could design a modern AI assistant aligned with its governance requirements and future roadmap.
The organization’s cloud estate was already hosted on AWS, with SoftwareOne acting as key account manager. This existing relationship gave the customer confidence that a new AI assistant could be integrated cleanly into its infrastructure and operated within existing security and compliance guardrails.
AWS was selected as the platform for the assistant because it offers an end-to-end stack for agentic AI, from foundation models to orchestration and governance. Amazon Bedrock provides access to a range of leading models behind a managed service, allowing the team to select models that best fit retrieval-augmented question answering and conversational flows. Crucially, AWS’s agentic capabilities – such as Bedrock Flows and AgentCore – make it possible to design assistants that can orchestrate multi-step tasks, manage conversation state and call tools or back-end systems rather than simply returning a single model response.
In less than six months, SoftwareOne designed and implemented a customer‑facing AI assistant that exposes a single digital entry point for inquiries, initially focused on the organization’s upskilling and training programs. Within this pilot domain, four departments cover the end‑to‑end journey: enrollment and onboarding, training and claims support, changes during participation, and completion and reconciliation.
At the core of the solution is an agentic system built on AWS. Amazon Bedrock provides the foundation model layer and underpins the reasoning and natural-language capabilities of the assistant. The team used Bedrock Flows and related orchestration tooling to define multi-step workflows that allow the assistant to interpret user intent, decide whether to retrieve knowledge, ask clarifying questions, or route the interaction to a human support path.
The solution's architecture is designed so that additional components, such as Amazon Bedrock AgentCore, can be integrated later, opening options for the assistant to interface with external systems for data gathering and to call specialized, requirements-driven tools.
All this enables richer "agentic" behaviors, such as following up to determine whether a user is asking about a planned course or an in-progress enrollment before providing guidance.
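To make the orchestration concrete, the per-turn decision logic described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the intent names, confidence threshold, and action labels are assumptions for this example, not details of the actual Bedrock Flows implementation.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    intent: str          # e.g. "enrollment_question", "claim_status", "unknown" (illustrative labels)
    confidence: float    # intent-classification confidence, 0..1
    has_context: bool    # e.g. do we know whether it's a planned course or an in-progress enrollment?

def next_action(turn: Turn) -> str:
    """Decide the next step for a single conversational turn (sketch only)."""
    if turn.confidence < 0.5:
        return "clarify"       # low confidence: ask a follow-up question
    if turn.intent == "unknown":
        return "escalate"      # route to a human support path
    if not turn.has_context:
        return "clarify"       # e.g. planned course vs in-progress enrollment?
    return "retrieve"          # answer from the curated knowledge base

print(next_action(Turn("enrollment_question", 0.9, False)))  # clarify
print(next_action(Turn("claim_status", 0.9, True)))          # retrieve
```

In the production system, this kind of branching lives inside Bedrock Flows rather than application code, which is what lets new decision paths be added without rebuilding the bot.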
The accuracy and helpfulness of this guidance depend on the quality of the underlying data, which must be up-to-date at all times.
The AWS agentic system ensures users always interact with the latest information. Knowledge ingestion and retrieval are handled through automated pipelines that crawl and index curated internal content across participating departments. These pipelines refresh the knowledge base on a scheduled basis, so that updated documents, contact details, and process changes are incorporated without manual reconfiguration.
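The core idea behind the scheduled refresh is incremental re-indexing: only content changed since the last crawl is re-ingested. The sketch below illustrates that idea with assumed document records and field names; in the real solution this corresponds to managed ingestion pipelines, not hand-rolled code.

```python
from datetime import datetime, timezone

def docs_to_reindex(documents, last_sync):
    """Return the documents modified after the previous sync (illustrative)."""
    return [d for d in documents if d["last_modified"] > last_sync]

# Hypothetical document metadata from a crawl of departmental content
docs = [
    {"id": "trade-faq", "last_modified": datetime(2024, 5, 1, tzinfo=timezone.utc)},
    {"id": "claims-guide", "last_modified": datetime(2024, 6, 10, tzinfo=timezone.utc)},
]
last_sync = datetime(2024, 6, 1, tzinfo=timezone.utc)

print([d["id"] for d in docs_to_reindex(docs, last_sync)])  # ['claims-guide']
```

Because updated documents, contact details, and process changes flow in on this schedule, no one has to reconfigure the assistant when a department revises its content.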
The conversational interface is exposed via an AWS-integrated chat front end, with Amazon Lex providing natural language understanding and integration into web entry points. Bedrock Guardrails are applied across the flow to enforce safety and the organization’s governance policies, including restricting responses to vetted sources and filtering harmful or out-of-scope content. These safeguards ensure the system operates as a responsible agent with appropriate pathways for resolution.
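The "vetted sources only" restriction can be pictured as an allowlist check on the sources a response cites. This is a simplified analogy for how a grounding policy behaves, with an assumed allowlist and response shape; the actual enforcement is handled by Bedrock Guardrails, not application code.

```python
# Hypothetical allowlist of curated internal sources (illustrative names)
VETTED_SOURCES = {"training-handbook", "claims-guide", "enrollment-faq"}

def is_allowed(response):
    """Allow a response only if it cites at least one source and all are vetted."""
    cited = set(response["sources"])
    return bool(cited) and cited <= VETTED_SOURCES

print(is_allowed({"text": "...", "sources": ["claims-guide"]}))  # True
print(is_allowed({"text": "...", "sources": ["random-blog"]}))   # False
```

An uncited or out-of-scope answer fails the check, which is what pushes the conversation toward a safe fallback or a human pathway instead of a fabricated response.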
When the assistant cannot satisfactorily answer a query, it automatically generates a ticket for the relevant support team, including conversation context and contact details, to enable informed human follow-up.
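An escalation ticket of this kind bundles the conversation context and contact details into a single record for the support team. The field names and department label below are assumptions for illustration, not the customer's actual ticket schema.

```python
from datetime import datetime, timezone

def build_ticket(department, contact_email, transcript):
    """Assemble an escalation ticket from an unresolved conversation (sketch)."""
    return {
        "department": department,           # routing target for human follow-up
        "contact": contact_email,           # how the team reaches the user
        "transcript": transcript,           # conversation context, in order
        "created_at": datetime.now(timezone.utc).isoformat(),
        "status": "open",
    }

ticket = build_ticket(
    "training-and-claims",
    "member@example.com",
    [
        "User: How do I reclaim credits for a cancelled course?",
        "Assistant: I couldn't find a definitive answer; escalating.",
    ],
)
print(ticket["department"], ticket["status"])  # training-and-claims open
```

Carrying the transcript forward is the key design choice: the human agent picks up with full context rather than asking the user to repeat themselves.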
The project demonstrates how AWS agentic services can transform complex support environments into scalable, user-friendly digital experiences.
Learn how SoftwareOne helps organizations design, govern, and scale AI solutions on AWS.
Before the project, users navigated multiple phone numbers corresponding to different floors, often needing to guess which department could handle their question. First‑line inquiries were handled entirely by a small group of staff, who then chased down information or redirected calls, leading to delays and inconsistent experiences.
The new assistant replaces this fragmented entry point with a single conversational channel capable of handling common questions in natural language. Users can ask about how to enroll, which forms to submit, how to claim credits, or what happens when employee participation changes, and receive responses grounded in the organization’s curated documentation. Guided journeys help users understand next steps, such as where to apply, which supporting documents are needed, or how to track status.
Because the system’s interactions are agentic rather than purely scripted, the solution generalizes beyond simple FAQ patterns. The assistant can decide when to ask clarifying questions, when to surface specific contact details, and when to escalate to human agents, based on the intent and complexity of each interaction. At the same time, the underlying architecture is designed so that additional units and departments can join the platform by supplying their content and contact information, without rebuilding the core bot logic.
The engagement delivered a modern platform that transforms how the organization delivers knowledge to its users. The result is an experience that makes information easier to find and interactions more intuitive.
Internally, staff across the building are already using the assistant to test its behavior. Early feedback has been positive, with internal testing showing high satisfaction and confirming that the assistant handles user queries effectively.
For AWS, the case exemplifies how partners can combine foundation models, orchestration services, and governance capabilities into an end‑to‑end agentic assistant that goes beyond a standalone chat interface. For the customer, it marks a significant step in its AI adoption journey: moving from manual call‑based support and isolated experiments to a coherent, scalable, and responsibly governed AI platform built on AWS.
By combining AWS foundation models, orchestration, and governance capabilities, the organization created a scalable platform for future AI innovation.

Share a few details about your business challenge, and we’ll get right back to you.