What is CORE AI?
CORE AI builds AI products.
We have:
Cloud Query, which augments a DevOps engineer's ability to build, manage, understand, and migrate cloud infrastructure. It is built on a combination of customised LLMs (Large Language Models), fine-tuning, RAG (Retrieval-Augmented Generation), and proprietary API-aware stacks.
Mark Papers, a tool that helps teachers mark exam papers.
Why are you called CORE AI?
CORE stands for Contextual Response Engine.
Cloud Query ingests existing Infrastructure as Code repositories (such as Azure ARM templates, AWS CloudFormation, or HashiCorp Terraform) and documentation (such as Confluence pages and Markdown files), processes the content, and then lets you build, manage, migrate, or understand your infrastructure via contextual responses.
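The ingest-then-respond flow described above follows the general shape of a Retrieval-Augmented Generation pipeline. As a minimal illustrative sketch only (this is not CORE AI's actual implementation; the bag-of-words scoring, function names, and sample corpus are all assumptions for demonstration), the retrieval step might look like:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real RAG pipeline would use a
    # trained embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    # Rank ingested chunks (IaC snippets, docs) by similarity to the query
    # and return the top k as context for the LLM.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

# Hypothetical ingested corpus: snippets from IaC repositories and docs.
corpus = [
    'resource "aws_s3_bucket" "logs" { bucket = "app-logs" }',
    "Confluence: our VMs are deployed via Azure ARM templates.",
    "README: terraform apply runs in the CI pipeline.",
]

context = retrieve("which s3 bucket stores logs", corpus)
# The retrieved context is then combined with the user's question and
# sent to the model, anchoring the answer in the ingested content.
```

The key design point is that answers are grounded in retrieved source material rather than in the model's parametric memory alone.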
Mark Papers ingests students' exam papers and creates reports for teachers.
Why CORE AI?
Here's why we'd recommend CORE AI over a general-purpose LLM such as Bard, ChatGPT, or Claude.
Your data is secure. It will not be sent over the wire to third-party services such as ChatGPT.
LLMs suffer from hallucinations: they generate plausible-sounding completions that are false or refer to things that don't exist. CORE AI is built to anchor its responses to ground truths.
For Cloud Query: LLMs are limited by their training cut-off date (the point at which their training data ends). For example, GPT-4 has a cut-off date of September 2021, so infrastructure APIs and coding libraries released or deprecated since then are unknown to it. CORE AI uses the latest cloud APIs and coding libraries: if an API is deprecated yesterday, CORE AI will know about it today. In addition, Cloud Query lets you ingest your Infrastructure as Code and can fine-tune its models against your code base and documentation; general-purpose LLMs are not aware of your code base.