
In our latest project, we needed to create an almost fully autonomous AI agent that users could call to order bike products from a Medusa-based online store - no web browser needed, just their voice.
The goal was to provide a smooth, automated experience with voice technology while ensuring accurate data collection and seamless integration with backend systems.
To address these challenges, our team developed a custom AI Voice Agent, based on BlandAI, designed to act as a real support agent in a Medusa eCommerce environment. It processes information provided by the caller, adjusts its tone for more natural interactions, and connects to live APIs to fetch real-time data and present it back to the user during the call.
In this article, I’ll walk you through how we designed and built this feature, focusing on the key steps, decisions, and technologies involved - told from the perspective of a bike store owner aiming to simplify customer service with AI.
Challenge
Many users prefer hands-free experiences, especially on mobile. The client wanted to create a voice-ordering feature that reduces friction in the checkout process and increases conversions.
We needed to create an AI voice solution capable of handling customer interactions seamlessly. Let’s outline the scope of work this assistant would need to cover, breaking down every step almost like a script the assistant would follow if it were a real person handling the call. So, each time someone calls our infoline, the AI would need to go through the following steps:
- Answer the call.
- Greet the caller and introduce itself.
- Say a few words about the company.
- Collect the required information (like height, weight, preferred color, urgency, purpose of the bike, brand, model, etc.).
- Search the webshop for bikes that match the given requirements.
- Suggest one, two, or as many options as the caller wants.
- Once the caller chooses a specific bike, ask if they’d like to receive a checkout link via email and/or text message.
- Send the link through the chosen channel.
- Inform the caller about store policies and the bike guarantee, then politely end the call.
We were working with a pre-existing pathway built on predefined prompts, which dictated how the AI should behave, the voice tone, and which parameters to extract. However, there was no integration with live servers, no data queries - just static prompts and the AI’s best attempt to simulate a functional conversation.
Our challenge was to enhance this system by introducing custom features, improving AI options, and ensuring the agent could retrieve and process relevant information accurately.
Now that we’ve outlined the flow - or really, the algorithm - we can move on to building the AI to follow it.
Solutions
Preparing the conversational pathways
After outlining and writing down the requirements for the assistant, it was time to put everything into action - or more accurately, into BlandAI, a platform that allows us to build conversational pathways.
So, what exactly are these pathways? As mentioned earlier, they’re essentially a set of nodes (or “boxes”), and each one can represent a single step from the previous section. We can also define variables - like brand, model, height, etc. - and use them either to call an API or to navigate to another node within the pathway.
These nodes are connected by routes, and each route has a condition that defines when it should be followed - for example, “take this route if the user doesn’t want a bike,” which could then lead to a polite goodbye. You get the idea.
Below is an infographic of our example pathway.

This is exactly the approach we used when developing the feature. We stepped into the shoes of this virtual assistant, mapped out its responsibilities, and designed the conversational pathways to reflect the logic of the script we had previously prepared.
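To make the structure a bit more concrete, here is a rough TypeScript sketch of how such a pathway could be modeled conceptually. This is purely our own illustration for this article - it is not BlandAI’s actual schema, and the node names and fields are invented for the example.

// Conceptual model of a conversational pathway: nodes connected by routes,
// where each route carries a condition that tells the AI when to follow it.
interface PathwayNode {
  id: string;
  prompt: string;                // what the assistant should say or do at this step
  extractVariables?: string[];   // e.g. ["height", "weight", "color"]
}

interface Route {
  from: string;                  // id of the source node
  to: string;                    // id of the target node
  condition: string;             // e.g. "the caller does not want a bike"
}

const pathway: { nodes: PathwayNode[]; routes: Route[] } = {
  nodes: [
    { id: "greeting", prompt: "Greet the caller and introduce yourself as Joey." },
    {
      id: "collect",
      prompt: "Collect height, weight, color, brand and model.",
      extractVariables: ["height", "weight", "color", "brand", "model"],
    },
    { id: "goodbye", prompt: "Thank the caller and politely end the call." },
  ],
  routes: [
    { from: "greeting", to: "collect", condition: "the caller wants to buy a bike" },
    { from: "greeting", to: "goodbye", condition: "the caller does not want a bike" },
  ],
};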
Fine-tuning the pathways
At this stage of development, we spent a lot of time fine-tuning and tweaking the assistant’s behavior. Initially, we worked only with prompts, without connecting to real data - mostly because AI is great at making things up, including imaginary products. So instead, we focused on shaping the overall user experience.
For example, we wanted to make sure that when someone calls, they’re greeted warmly and professionally. The assistant should introduce itself, explain its role, and ask about the reason for the call. Below is a very shortened version of the initial prompt used at the beginning of the pathway:
You are a voice assistant at BikerStore, a premium bike store.
After starting the call, warmly greet the caller, introduce yourself as Joey,
explain your role in the process, and ask the client how you can be of service.
Remember to be polite and enthusiastic.
This part took a lot of time and testing. We wanted the assistant to behave exactly as expected - following the script, not skipping over important information, and not jumping to unrelated nodes too early.
A good example of where things had to work perfectly is when we needed to extract product-related information from the user. Since the assistant needs to look up matching bikes later, we had to make sure it could gather key details reliably. Fortunately, BlandAI makes this fairly simple. We’d create a new node and set the prompt like this:
Extract the user's height and weight, along with their preferred bike color
and preferred brand and/or model.
Do not proceed until all of this information has been collected.
Be polite, and don’t rush the client.
In BlandAI, each node allows us to define which variables to extract. Once captured, they’re stored as global variables, so the assistant can access them at any point in the conversation - for example, when calling an API or sending the checkout cart.
One important setting here is the option to loop the node until all variables are extracted. Sure, you could create one node per variable (so, n nodes for n values), but that would clutter the pathway and make the whole thing hard to manage and maintain.
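As a side note, it helped us to write down the shape of the data the assistant is expected to collect before wiring anything up. Below is an illustrative TypeScript interface for the global variables from the prompt above - the names are our own convention, not something BlandAI mandates.

// The global variables we expect the assistant to have extracted by the time
// it reaches the webhook node. Everything arrives as a string, since the
// values come straight from the conversation.
interface ExtractedBikeRequest {
  height: string;   // e.g. "180 cm"
  weight: string;   // e.g. "75 kg"
  color: string;    // preferred color
  brand?: string;   // optional: preferred brand
  model?: string;   // optional: preferred model
}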
Before connecting to a real-world API, we needed to test everything thoroughly. BlandAI offers three main ways to test:
- Using the Test Pathway feature directly in the platform.
- Making a real test call to walk through the entire flow.
- Writing simple tests for specific nodes.
Overall, we found these tools super helpful in making sure things worked exactly as planned. If you want to dive deeper into testing BlandAI pathways, you can check out this module: BlandAI: Module 5 | Lesson 3: Feedback and Monitoring.
Preparing the API and connecting it to BlandAI
Once the full flow was built inside Conversational Pathways - complete with prompts, routing, and variable extraction - and we confirmed everything worked to our standards, it was time to connect the assistant to a real API.
In our bike store assistant example, we knew we’d need at least two essential webhooks in our API:
- Search – to look for bikes based on the criteria collected from the caller.
- Send – to send a cart link via email and/or text message to the caller.
I won’t dive into the implementation details of the API itself - you’re free to use whatever language or stack you prefer, as long as you remember to sign your webhooks, as required by BlandAI. What matters here is how you connect those webhooks to BlandAI, and there are two ways to do that:
- Webhook node
- Tools
Webhook node
A webhook node is a standard step in the conversation pathway that triggers an external API call. It’s pretty flexible - you can use previously gathered variables as either query parameters or request body content.
If you want to use query parameters, there’s a neat trick that worked well for us. When collecting data in a prior node, define a variable like searchUrl and add a short mini-prompt to guide the AI on how to build it:
This variable will be used in a webhook call. The initial value is `localhost:3000`.
Whenever the user provides a value - for example, height - append it as a query parameter, like so:
`localhost:3000?height=100`
Do this for height, weight, brand, model, and color.
And that’s it! If a webhook node is present later in the pathway, all you need to do is set its URL to {{searchUrl}} - and voilà!
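For context, here is a minimal sketch of what the receiving side of that webhook could look like. It assumes an Express server running next to the Medusa backend and uses Medusa’s standard GET /store/products endpoint with a simple text search - the route, port, and filtering logic are illustrative, and signature verification is only hinted at in a comment.

import express from "express";

const app = express();
const MEDUSA_URL = process.env.MEDUSA_URL ?? "http://localhost:9000";

// Search webhook: BlandAI calls this URL with the query parameters the
// assistant appended to {{searchUrl}} (height, weight, brand, model, color).
app.get("/", async (req, res) => {
  // NOTE: verify the webhook signature here (per BlandAI's docs) before
  // trusting the request - omitted for brevity.
  const { brand, model, color } = req.query as Record<string, string | undefined>;

  // Keep it simple: use brand/model/color as a free-text search term.
  // Mapping height and weight to a frame size is left to your own logic.
  const q = [brand, model, color].filter(Boolean).join(" ");

  // Depending on your Medusa version, store requests may also require a
  // publishable API key header.
  const response = await fetch(
    `${MEDUSA_URL}/store/products?q=${encodeURIComponent(q)}&limit=5`
  );
  const { products } = (await response.json()) as { products: any[] };

  // Return a compact list the AI can read back to the caller.
  res.json(
    products.map((p) => ({ id: p.id, title: p.title, description: p.description }))
  );
});

app.listen(3000, () => console.log("Webhook API listening on port 3000"));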
Tools
Another option is using BlandAI Tools, which are a bit more advanced and flexible - but also a bit harder to control. Tools let you define all the usual fields of a webhook (URL, method, parameters, body, etc.) plus an extra one: the usage field.
In the usage field, you tell the AI when and how the tool should be used. It’s helpful to include a few examples so the assistant can recognize the pattern. In theory, the AI will then automatically decide when to use the tool and how to apply the response (e.g., to recommend a bike based on the returned data).
In practice, we found tool usage to be less reliable when prompts got long or complex. To improve consistency, you can directly instruct the AI to use a specific tool like this:
When you gather all the information, perform a search using {{searchTool}}.
Just make sure the name of the tool is placed between double curly brackets.
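For completeness, the second webhook from our earlier list - the one that sends the cart link - can stay just as small. Here’s a rough sketch, again assuming an Express server; the route and field names are our own choice, and the actual email/SMS delivery (SendGrid, Twilio, or whatever you already use) is stubbed out.

import express from "express";

const app = express();
app.use(express.json());

// Send webhook: called once the caller has picked a bike and chosen a
// delivery channel. BlandAI passes the extracted variables in the body.
app.post("/send", async (req, res) => {
  const { email, phone, cartUrl } = req.body as {
    email?: string;
    phone?: string;
    cartUrl: string;
  };

  // Plug in your own providers here (e.g. SendGrid for email, Twilio for SMS).
  if (email) {
    await sendEmail(email, `Here is your checkout link: ${cartUrl}`);
  }
  if (phone) {
    await sendSms(phone, `Your bike is waiting: ${cartUrl}`);
  }

  res.json({ status: "sent" });
});

// Stubs - replace with real integrations.
async function sendEmail(to: string, text: string) { /* ... */ }
async function sendSms(to: string, text: string) { /* ... */ }

app.listen(3001, () => console.log("Send webhook listening on port 3001"));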
Testing
After all those steps are complete, the last thing to do is run a round of tests to check that everything works as expected.
- Use BlandAI’s built-in Test Pathway feature.
- Try real calls to see how the assistant behaves.
- Add more examples in your prompts to help the AI better interpret user responses.
- Experiment with global variables, tool usage, and node fine-tuning using BlandAI’s testing tools.
Tweak, test, repeat. Once everything works as expected, your AI voice assistant will be ready to handle real conversations - with real data.
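If you’d rather not dial in manually every time, test calls can also be triggered programmatically. Below is a hedged sketch using BlandAI’s send-call endpoint - double-check the exact URL, headers, and parameter names against their current API reference, and treat the values here as placeholders.

// Trigger a test call against the pathway programmatically.
// Endpoint and parameter names should be verified against BlandAI's API
// reference before relying on this sketch.
const BLAND_API_KEY = process.env.BLAND_API_KEY ?? "";

async function startTestCall(phoneNumber: string, pathwayId: string) {
  const response = await fetch("https://api.bland.ai/v1/calls", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      authorization: BLAND_API_KEY,
    },
    body: JSON.stringify({
      phone_number: phoneNumber, // the number the assistant should call
      pathway_id: pathwayId,     // the conversational pathway to run
    }),
  });
  return response.json();
}

startTestCall("+15555550123", "<your-pathway-id>").then(console.log);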
Final Results
This feature demonstrates how custom voice technology solutions lead to faster processing times, fewer human errors, and reduced operational costs, making AI voice agents a scalable solution for eCommerce businesses.
Custom development projects like this highlight the importance of tailored solutions in improving workflows, increasing efficiency, and delivering a better experience for a global audience.
We hope this article shed some light on the process of developing these kinds of AI voice assistants. Thank you for your attention!