The rapid advancement of artificial intelligence/machine learning (AI/ML) in recent years has captured global attention, particularly with the introduction of Generative AI and OpenAI's latest Large Language Models (LLMs). OpenAI's ChatGPT website has enabled everyday users to engage with these models directly. The result is fierce competition among US tech giants (Microsoft, Amazon, Google, IBM, and others) and startups, each constantly releasing new capabilities through updated models.

While the steady cadence of new LLM releases is exciting, it raises crucial questions for many institutions, especially Federal Civilian, DoD, and Intelligence agencies. Agency leaders are asking whether AI/ML breakthroughs such as LLMs can address pressing issues, like reducing visa application backlogs, and if so, how. Moreover, as the next generation of AI/ML capabilities emerges, organizations must consider how this progress will affect their existing solutions. New research into smaller language models, for example, suggests they may outperform LLMs on some tasks.

Instabase is actively developing the LLM-Engine as part of its Enterprise offering; it will operate both as SaaS and in customer-hosted cloud environments. The primary goal of the LLM-Engine is to abstract away the complexity of building enterprise-grade applications that leverage the power of LLMs. It will address a range of challenges, and support for federal agencies is part of the Instabase roadmap. Additionally, because the LLM-Engine is part of the Instabase Enterprise Platform, customers can use the entire suite of models Instabase offers (LLMs and layout-aware deep learning models) to solve their most pressing challenges.

Initially, Instabase's LLM-Engine will use LLMs from OpenAI and the Azure OpenAI Service. In the future, it will also support other LLMs based on customer requirements. Instabase is currently building its own LLM based on the open-source Llama 2 model, which will be available to customers by the end of this calendar year. Instabase has integrated with OpenAI / Azure OpenAI in a way that ensures customer data privacy. Customers will be able to choose among OpenAI / Azure OpenAI, Instabase's own LLM, and other LLMs as they are released and made available on the platform.
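
To give a sense of the kind of integration involved, the sketch below shows a minimal call to the Azure OpenAI chat completions API using the openai Python library. The endpoint, deployment name, and prompt are placeholders, and this is only an illustration of the underlying API style, not Instabase's internal implementation.

```python
# Minimal, illustrative Azure OpenAI call for field extraction.
# The endpoint, deployment name, and prompt are placeholders; this is not
# Instabase's internal integration, only an example of the API it builds on.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

document_text = "Form I-130 ... Petitioner: Jane Doe ... A-Number: 123456789"  # sample text

response = client.chat.completions.create(
    model="gpt-35-turbo",  # name of your Azure deployment
    messages=[
        {"role": "system", "content": "Extract the petitioner name and A-Number as JSON."},
        {"role": "user", "content": document_text},
    ],
    temperature=0,
)
print(response.choices[0].message.content)
```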

It's worth noting that the LLM-Engine offers additional capabilities beyond what LLMs provide on their own. These include:

  • Encoding layout and style to handle complex use cases like tables
  • Support for long documents and multi-document use cases
  • Support for multimodal content (e.g., signatures, custom visual objects, barcodes)
  • Advanced refinements (e.g., checking against internal databases) and validations
  • Human-in-the-loop review / correction / verification
  • Enterprise Security and Compliance measures
  • Support for LLMs contained within the platform (InstaLLM) in addition to API-based LLM capabilities
  • Mitigation of the hallucination risk to which LLMs are susceptible (see the sketch following this list)
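
One common mitigation, sketched below with assumed field names, is a grounding check: each extracted value must actually appear in the source document before it is accepted, and anything that cannot be grounded is flagged for review. This is an illustrative example, not Instabase's specific hallucination-mitigation logic.

```python
# Illustrative grounding check: flag extracted values that cannot be found in the
# source text, a simple guard against hallucinated outputs. Field names are
# hypothetical placeholders.
import re

def grounding_check(source_text: str, extracted: dict) -> dict:
    """Split extracted fields into grounded values and candidates for human review."""
    normalized = re.sub(r"\s+", " ", source_text).lower()
    grounded, needs_review = {}, {}
    for field_name, value in extracted.items():
        candidate = re.sub(r"\s+", " ", str(value)).lower()
        if candidate and candidate in normalized:
            grounded[field_name] = value
        else:
            needs_review[field_name] = value
    return {"grounded": grounded, "needs_review": needs_review}

# Example usage with a hypothetical extraction result.
source = "Petitioner: Jane Doe\nA-Number: 123456789"
extracted = {"petitioner_name": "Jane Doe", "a_number": "987654321"}
print(grounding_check(source, extracted))
# 'petitioner_name' is grounded; 'a_number' is flagged for review.
```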

Instabase's LLM capabilities for the private sector are available now in its 23.07 release. For the Federal Government, the roadmap includes LLM capabilities built on Instabase's own LLM, expected in the last quarter of 2023. A beta version is already publicly available at aihub.instabase.com.

This platform has the potential to significantly impact operations at Federal agencies, helping them address their challenges more effectively. For instance, using LLMs, Instabase can develop a model that identifies and extracts the required information from commonly used document types, say an immigration A-file or a tax return, from a single sample. This streamlines the process and saves considerable time and development resources. Once the LLM for Federal is available, Instabase can extend this capability to identify and extract information from any document type, from structured to semi-structured to fully unstructured, still requiring only a single sample for development. This opens up the opportunity to process many document types at scale, whether they are typed or handwritten. With advanced validation capabilities, Instabase can flag for human review only the applications that fail agency validation rules; the data from the remaining forms can be extracted and sent seamlessly to downstream case management systems, as in the sketch below. In summary, Instabase allows Federal agencies to harness AI/ML innovations to address critical challenges at the agency level.
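
The following sketch illustrates that validate-then-route pattern. The rules, field names, and downstream handlers are hypothetical placeholders, not Instabase's actual API.

```python
# Illustrative validate-and-route step: applications that violate agency rules go
# to human review, the rest flow straight through to case management. All field
# names, rules, and handlers are hypothetical.
from dataclasses import dataclass

@dataclass
class ExtractedApplication:
    doc_id: str
    fields: dict

# Hypothetical agency validation rules: each maps a field name to a check.
AGENCY_RULES = {
    "applicant_name": lambda v: bool(v and str(v).strip()),
    "a_number": lambda v: v is not None and str(v).isdigit() and len(str(v)) in (8, 9),
}

def violations(app: ExtractedApplication, rules: dict) -> list:
    """Return the names of fields that fail their validation rule."""
    return [name for name, check in rules.items() if not check(app.fields.get(name))]

def send_to_human_review(app, failed_fields):
    print(f"{app.doc_id}: routed to reviewer, failed fields: {failed_fields}")

def send_to_case_management(app):
    print(f"{app.doc_id}: sent downstream to case management")

def route(applications, rules=AGENCY_RULES):
    for app in applications:
        failed = violations(app, rules)
        if failed:
            send_to_human_review(app, failed)   # exception path: human-in-the-loop
        else:
            send_to_case_management(app)        # straight-through processing

route([
    ExtractedApplication("A-file-001", {"applicant_name": "Jane Doe", "a_number": "123456789"}),
    ExtractedApplication("A-file-002", {"applicant_name": "", "a_number": "12AB"}),
])
```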

Azure OpenAI's recent FedRAMP High authorization opens up opportunities for the Federal Government to solve complex document processing challenges using the Instabase Platform. Additionally, with Instabase scheduled to release its own LLM, InstaLLM, Federal Government agencies will have the option of using LLM capabilities contained entirely within the Instabase platform, deployed in their own cloud environment. With either option, Instabase offers government agencies a robust and secure pathway to leverage LLM capabilities for their complex document processing use cases.