After days of tinkering and fine-tuning, I'm excited to share something I’ve been working on — OCI Generative AI Cohere Integration with external services. If you're someone who works with AI, Oracle Cloud, or just enjoys diving into new tech, this project might be something you'll find interesting.
What's the Project About?
In short, it's a simple yet powerful integration of Oracle Cloud's Generative AI service with Cohere's language model, exposed for external use so you aren't confined to the chat experience inside the OCI console. The goal is to provide an easy-to-use interface for AI-driven responses via either a Flask API or a function-based approach. Whether you're running the service locally or integrating it into your own applications, this project is meant to be a starting point for building smarter, AI-powered experiences from your preferred development platform, rather than from the playground in your Oracle OCI tenancy.
Why This Project?
If you're like me, you enjoy playing with new tools but don't always want to reinvent the wheel every time you build something. That's why I wanted to simplify integrating Oracle's powerful AI services into everyday applications. The project takes the heavy lifting out of the process and provides a clean, easy-to-use API for interacting with Oracle's AI models.
But it doesn't stop there. Whether you're building out something bigger or just playing around with ideas, this project is designed to be flexible and easily extendable.
Here’s What You Get:
- Flask API: If you're building a web-based application and want easy interaction with the Generative AI service, the Flask API is ready to go. You can send requests, get responses, and scale up as needed.
- Function-based API: For those who don't need a full web API, you can call the AI service directly from a function that you can import into your Python projects. Perfect for smaller integrations or experimenting within your own apps.
- Configuration flexibility: With environment variables and a configuration file, you have full control over things like model parameters and Oracle Cloud credentials.
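To make the function-based option concrete, here is a hedged sketch of what a call into the service might look like using the OCI Python SDK's Generative AI inference client. The function names (`build_chat_params`, `generate_response`), the environment variable names, and the default model parameters are my own illustrative assumptions, not necessarily how the project itself is structured:

```python
# Sketch of a function-based call to OCI Generative AI (Cohere chat).
# Assumes the `oci` SDK is installed and ~/.oci/config is set up.
# COMPARTMENT_OCID / MODEL_OCID are placeholder variable names.
import os

# Model parameters you might expose through configuration.
DEFAULT_PARAMS = {"max_tokens": 600, "temperature": 0.5}

def build_chat_params(message: str, **overrides) -> dict:
    """Merge the user's message with default model parameters."""
    params = dict(DEFAULT_PARAMS)
    params.update(overrides)
    params["message"] = message
    return params

def generate_response(message: str) -> str:
    """Send one chat message to the OCI Generative AI service (sketch)."""
    import oci  # imported here so the helper above works without the SDK

    config = oci.config.from_file()
    client = oci.generative_ai_inference.GenerativeAiInferenceClient(config)

    chat_request = oci.generative_ai_inference.models.CohereChatRequest(
        **build_chat_params(message))
    details = oci.generative_ai_inference.models.ChatDetails(
        compartment_id=os.environ["COMPARTMENT_OCID"],
        serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
            model_id=os.environ["MODEL_OCID"]),
        chat_request=chat_request)
    return client.chat(details).data.chat_response.text
```

From there, wrapping `generate_response` in a Flask route is a small step, which is essentially what the Flask API option does for you.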
How to Use It?
- Set up your environment: Whether you're using the Flask API or the function-based setup, you'll need to configure your environment with Oracle Cloud credentials and some basic settings.
- Get running: Once configured, you can easily call the API or function with any message you want to send to the AI model, and you'll get a detailed response in return.
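As a rough illustration of those two steps (the variable names, port, and route here are my assumptions; check the project's README for the exact values):

```shell
# Hypothetical environment setup; use your own tenancy's OCIDs.
export OCI_CONFIG_FILE=~/.oci/config
export COMPARTMENT_OCID="ocid1.compartment.oc1..example"
export MODEL_OCID="ocid1.generativeaimodel.oc1..example"

# With the Flask API running locally, a chat request might look like:
curl -s -X POST http://localhost:5000/chat \
     -H "Content-Type: application/json" \
     -d '{"message": "Summarize what OCI Generative AI offers."}'
```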
I've added full instructions on the project's GitHub page, including setup steps, environment variables, and how to run both approaches.
What’s Next?
The goal is to keep expanding this project. Right now it's all about integration, but as AI services evolve I want to keep adding new features, and I'll likely build apps that consume these endpoints to harness the power of the Cohere LLM. Expect more updates as I add new functionality and more tutorials to help everyone get the most out of the platform.
Feel free to dive into the project, contribute, or just ask questions. I'm always happy to connect with fellow developers who are exploring AI and Oracle Cloud.
Check it out on GitHub: OCI Generative AI Cohere Integration
I've added some screenshots below showing interactions with the OCI Gen AI service via Postman, using the Flask API, which extends access beyond the default chat experience in the OCI console.
By using the API approach, you can take advantage of the power of the Cohere LLM through Oracle OCI in your applications and data science tools, while keeping the security assurance that everything stays inside your OCI tenancy.
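To make that concrete, here is a small stdlib-only sketch of how an application might consume such an endpoint. The URL, route, and JSON field names (`"message"` in, `"response"` out) are assumptions about the API's shape, not its documented behavior:

```python
# Sketch of a client calling the local Flask API; adjust the URL and
# JSON field names to match the actual endpoint.
import json
import urllib.request

def build_body(message: str) -> bytes:
    """Encode a chat message as the JSON body the API is assumed to expect."""
    return json.dumps({"message": message}).encode("utf-8")

def ask_model(message: str, url: str = "http://localhost:5000/chat") -> str:
    """POST a message to the Flask API and return the model's reply text."""
    req = urllib.request.Request(
        url, data=build_body(message),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```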
Thanks for reading, and happy coding!
OCI Gen AI Playground
Postman Interaction with Flask API Exposing the OCI Gen AI Chat Service
OCI Gen AI Traffic from the API