Best Practices for Using the Kambrium API
Optimizing Prompts for Accuracy and Efficiency
The Kambrium API is designed to process natural language instructions and execute actions within your integrated SaaS tools. However, to achieve the best results, it's essential to structure prompts clearly and effectively. Below are some best practices for maximizing the accuracy and efficiency of your API requests.
1. Structuring Effective Prompts
A well-structured prompt ensures that the LLM correctly interprets the request and takes the expected action.
Use Clear and Specific Instructions
✅ Good Example:
"Create a new lead for Peter Walker, CFO at AMC Inc. Include company data, add a note that he’s interested in the Quantum Analytics Suite, and set a follow-up reminder for next Tuesday."
❌ Avoid This:
"Add Peter Walker as a lead and do the necessary."
👉 Why? The second example is vague and does not specify the desired actions, timeline, or data enrichment.
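To make this concrete, here is a minimal sketch of what sending such a prompt could look like from Python using the requests library. The endpoint URL, the prompt field, and the environment variable name are illustrative assumptions, not Kambrium's documented schema; check the API reference for the exact request format.

```python
import os
import requests

# Minimal sketch of sending a well-structured prompt to the Kambrium API.
# NOTE: the endpoint path and JSON fields below are assumptions for
# illustration only -- consult the API reference for the real schema.
API_KEY = os.environ["KAMBRIUM_API_KEY"]  # never hard-code the key

payload = {
    "prompt": (
        "Create a new lead for Peter Walker, CFO at AMC Inc. "
        "Include company data, add a note that he's interested in the "
        "Quantum Analytics Suite, and set a follow-up reminder for next Tuesday."
    ),
}

response = requests.post(
    "https://api.kambrium.example/v1/prompts",  # hypothetical endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```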
Provide Context When Needed
If your request involves multiple steps, provide a clear sequence.
✅ Good Example:
"Retrieve the latest deal updates from Salesforce for John Doe, focusing on any changes made in the last 7 days. Also, summarize key notes added to the deal."
❌ Avoid This:
"Get John Doe's deal updates."
👉 Why? Adding a time frame and specifying what details are needed ensures a more relevant response.
Break Down Complex Requests
If a task involves multiple distinct actions, consider sending separate API calls instead of one overly complex prompt.
✅ Preferred Approach:
First request: "Retrieve the latest activity log for John Doe’s Salesforce deal."
Second request: "Summarize key action points from John Doe’s activity log."
❌ Avoid This:
"Get John Doe's latest activities, summarize important points, and also let me know if any deals changed."
👉 Why? Large multi-step requests increase processing time and can lead to ambiguous responses.
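Continuing with the same hypothetical endpoint, the preferred approach above could be implemented as two narrowly scoped calls, with the second request building on the first one's result. Again, the field names are assumptions for illustration only.

```python
import os
import requests

API_URL = "https://api.kambrium.example/v1/prompts"  # hypothetical endpoint
HEADERS = {"Authorization": f"Bearer {os.environ['KAMBRIUM_API_KEY']}"}

def run_prompt(prompt: str) -> dict:
    """Send a single, narrowly scoped prompt and return the parsed response."""
    resp = requests.post(API_URL, headers=HEADERS, json={"prompt": prompt}, timeout=30)
    resp.raise_for_status()
    return resp.json()

# First request: retrieve the raw data.
activity_log = run_prompt(
    "Retrieve the latest activity log for John Doe's Salesforce deal."
)

# Second request: work on the result of the first one.
summary = run_prompt(
    "Summarize key action points from the following activity log: "
    f"{activity_log}"
)
print(summary)
```

Splitting the work this way also makes partial failures easier to handle: if the second call fails, the first result is still available and only the summary step needs to be retried.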
2. Monitoring and Managing Token Usage
Every request consumes tokens, and optimizing their usage helps control costs and improve response times.
Shorter, well-defined prompts are more efficient than long, ambiguous ones.
Avoid redundant phrasing that does not add value to the request.
Track your usage via the Kambrium Dashboard to manage costs effectively.
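Beyond the dashboard, you may be able to track usage programmatically. Many LLM-backed APIs return a usage object alongside each result; whether and how Kambrium exposes this is an assumption in the sketch below, so adapt the field names to the actual response schema.

```python
import os
import requests

# Reuses the hypothetical endpoint from section 1. The "usage" field and its
# keys are assumptions -- adapt them to Kambrium's actual response schema.
resp = requests.post(
    "https://api.kambrium.example/v1/prompts",
    headers={"Authorization": f"Bearer {os.environ['KAMBRIUM_API_KEY']}"},
    json={"prompt": "Retrieve the latest deal updates from Salesforce for John Doe."},
    timeout=30,
)
resp.raise_for_status()
usage = resp.json().get("usage", {})
print("total tokens used:", usage.get("total_tokens", "not reported"))
```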
3. Execution Reliability & System Improvement Over Time
While Kambrium is designed to interpret and execute prompts as accurately as possible, we cannot guarantee that every request will be executed flawlessly due to the complexities of natural language processing and API interactions.
Why Some Prompts May Not Execute Correctly
The LLM may misinterpret vague or overly complex instructions.
The connected SaaS platform may return unexpected errors due to data inconsistencies.
Some third-party API limitations may affect execution success.
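Because any of these failure modes can surface as a transient error, it is worth wrapping calls in basic error handling and a bounded retry. Here is a minimal sketch, again against the hypothetical endpoint used in the earlier examples:

```python
import os
import time
import requests

API_URL = "https://api.kambrium.example/v1/prompts"  # hypothetical endpoint
HEADERS = {"Authorization": f"Bearer {os.environ['KAMBRIUM_API_KEY']}"}

def run_prompt_with_retry(prompt: str, attempts: int = 3) -> dict:
    """Retry transient failures (network errors, 5xx responses) with backoff."""
    for attempt in range(1, attempts + 1):
        try:
            resp = requests.post(
                API_URL, headers=HEADERS, json={"prompt": prompt}, timeout=30
            )
        except (requests.ConnectionError, requests.Timeout):
            if attempt == attempts:
                raise
        else:
            if resp.status_code < 500:
                resp.raise_for_status()  # 4xx: a malformed request won't improve on retry
                return resp.json()
            if attempt == attempts:
                resp.raise_for_status()  # persistent 5xx: give up and surface the error
        time.sleep(2 ** attempt)  # simple exponential backoff before the next attempt
    raise RuntimeError("unreachable")  # the loop always returns or raises
```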
Continuous System Learning & Optimization
Kambrium is continuously improving through:
✅ User feedback-driven learning – Refinements based on real-world interactions.
✅ Ongoing LLM optimization – Fine-tuning prompt interpretation for accuracy.
✅ Enhancements to SaaS integrations – Expanding API coverage and reliability.
Over time, the system handles a wider range of prompt phrasings with increasing accuracy.
4. Keeping Your API Key Secure
To protect your account and prevent unauthorized usage, never expose your API key in public repositories or client-side code.
🔹 Best Practices for API Key Management:
Store API keys in environment variables or a secrets manager.
Rotate keys periodically for added security.
Restrict API key usage to specific applications or IPs if possible.
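As an illustration of the first point, reading the key from an environment variable keeps it out of your source code. The variable name KAMBRIUM_API_KEY is just the convention used in this guide's sketches, not a required name:

```python
import os

# Read the key from the environment instead of hard-coding it.
# KAMBRIUM_API_KEY is an illustrative variable name -- use whatever your
# deployment or secrets manager provides.
api_key = os.environ.get("KAMBRIUM_API_KEY")
if not api_key:
    raise RuntimeError(
        "KAMBRIUM_API_KEY is not set; configure it in your environment "
        "or secrets manager before starting the application."
    )

headers = {"Authorization": f"Bearer {api_key}"}
```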