
Announcing Generative AI and Interactive AI Solutions by MontyCloud

The Future of Autonomous CloudOps is Here

Generative AI is taking the world by storm as ChatGPT and Bard become household names, but many organizations don’t know where to start. They worry about exposing their intellectual property, the costs of building their own models, and which use cases to experiment with. With MontyCloud, that is no longer the case. With Amazon Web Services (AWS) announcing General Availability of Amazon Bedrock today, I can share a huge step forward in MontyCloud’s mission to democratize the cloud. The announcements in this blog help organizations of all sizes, from SMBs to enterprises, managed service providers to higher education, and everyone in between. In minutes, MontyCloud’s Generative AI Blueprint and CloudOps Copilot, an Interactive AI CloudOps agent, let you harness generative AI and interactive AI safely in your business context without writing a line of code.

 
Generative AI Blueprint: Available Today, September 28, 2023

Build generative AI-powered applications on AWS that connect to your organization’s data securely within your environment. Maintain full visibility, control, and operational agility at every step. The blueprint provisions everything you need to start training your own models; you just select the data. Our initial tests show that organizations can start building their own Large Language Model (LLM) chatbots on AWS for under $100 a month in AWS infrastructure costs! This means you don’t need to learn services like Amazon Bedrock or understand how to configure Amazon SageMaker, Amazon OpenSearch, and related services (a short sketch of that SDK-level setup appears after the list below). More importantly, you won’t need to worry about setting up the correct data protection and generative AI policy guardrails. With the Generative AI Blueprint, you can:

  • Provision a well-governed project workspace where your teams can experiment with approved LLMs and build chat applications.
  • Set up automated governance guardrails and foundational best practices for the workspace.
  • Stay in full control of the LLMs in use, data residency, privacy, and cost.
  • Give your developers a ready-to-use API endpoint to start building generative AI applications that leverage your organization’s data.
  • Give your business users a ready-to-use chat application companion that is private to them and works within your organization’s boundaries.
  • Enable business users to build their own generative AI-powered solutions, such as custom chatbots.
  • Use self-service build workspaces that include services such as Amazon Bedrock, Amazon SageMaker, Amazon OpenSearch, secure networking, encrypted storage, and MFA-enabled access controls.
  • Access the LLMs supported by Amazon Bedrock in the workspace, starting with Amazon Titan, Anthropic Claude, and AI21 Jurassic.
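To put that in perspective, here is a minimal sketch of what invoking a foundation model on Amazon Bedrock directly with the AWS SDK for Python (boto3) looks like, that is, the kind of setup the blueprint provisions and abstracts for you. It assumes AWS credentials and Bedrock model access are already enabled in your account; the model ID and prompt format follow Bedrock’s Anthropic Claude v2 API at launch, and are shown for illustration rather than as a MontyCloud API.

```python
import json
import boto3

# Bedrock runtime client; assumes AWS credentials and Bedrock model access
# have already been granted in this account and region.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Anthropic Claude v2 request format on Amazon Bedrock (as of GA in 2023).
body = json.dumps({
    "prompt": "\n\nHuman: What are the benefits of tagging AWS resources?\n\nAssistant:",
    "max_tokens_to_sample": 300,
    "temperature": 0.5,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The response body is a stream; parse it and read the model's completion.
answer = json.loads(response["body"].read())["completion"]
print(answer)
```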
Use Case Examples:


  Business Example

A sales leader wants to provide sellers with self-service enablement. The sales leader provisions an entire generative AI environment, along with a private chatbot, in minutes and uploads sales and technical documents into a MontyCloud project. The sales leader and sellers can then ask the chatbot questions and get answers drawn directly from internal documentation, with guaranteed privacy and confidentiality.

  Developer Example

A developer wants to build an LLM-powered application. The developer or a cloud admin provisions the Generative AI Blueprint, which creates an approved, secure endpoint to start writing code against. The application accesses the endpoint securely via Amazon API Gateway, and the endpoint delivers responses based on the vector embeddings created and stored securely within the MontyCloud project. The developer can now build secure LLM applications that stay within the organization’s guardrails.
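As a rough illustration, here is a minimal sketch of how an application might call such a blueprint-provisioned endpoint. The endpoint URL, API-key header, and request/response shapes below are assumptions made for the example; the actual contract is defined by the API the blueprint provisions in your account.

```python
import os
import requests

# Hypothetical values: the real endpoint URL and auth scheme come from the
# Generative AI Blueprint deployed in your own AWS account.
ENDPOINT = os.environ["GENAI_ENDPOINT"]   # e.g. an Amazon API Gateway URL
API_KEY = os.environ["GENAI_API_KEY"]

def ask(question: str) -> str:
    """Send a question to the private GenAI endpoint and return the answer."""
    response = requests.post(
        ENDPOINT,
        headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
        json={"question": question},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: {"answer": "..."}
    return response.json()["answer"]

if __name__ == "__main__":
    print(ask("Summarize our latest sales enablement guide."))
```

Because the endpoint sits behind Amazon API Gateway inside your environment, prompts and responses stay within your organization’s boundary, in line with the guardrails described above.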

 
MontyCloud’s CloudOps Copilot, an Interactive AI CloudOps Agent: Private Preview

MontyCloud’s CloudOps Copilot is the 24/7 Cloud Architect and seasoned Cloud Engineer you’ve been dreaming of adding to your team. CloudOps Copilot answers questions while optimizing your AWS infrastructure.

  • CloudOps Copilot combines the knowledge of cloud solution architects, operational best practices, and your own unique AWS footprint and applications.
  • CloudOps Copilot is context-aware and helps teams with everything from answering questions and building well-managed cloud infrastructure to automating routine operations and collaborating across digital transformation initiatives, all in a natural and interactive manner.
  • MontyCloud’s CloudOps Copilot is expected to be available in public beta before re:Invent 2023 (November 27, 2023).
 
Get Started

I’m not a solutions architect, but with MontyCloud I can now safely provision a generative AI environment, upload data, and start interacting with a chatbot in minutes. What are you waiting for?

Don’t be the last one to implement generative AI for your organization. MontyCloud enables you to get started in minutes.

We invite you to try MontyCloud so you can innovate more, operate less, and transform your business. The Generative AI Blueprint is gated, so contact the MontyCloud team to get started.

This is the second blog in our Generative AI/Interactive AI Solutions by MontyCloud series. The other blogs in this series are linked below.

 

Blog 1 - The Future of Autonomous CloudOps: From Generative AI to Interactive AI, by founders Venkat Krishnamachari and Kannan Parthasarathy.