
Microsoft unveils TypeChat library for building natural language interfaces


With its TypeChat library, Microsoft is looking to enable easy development of natural language interfaces to large language models (LLMs) using types.

Available on GitHub, TypeChat is an open source library that uses TypeScript and generative AI to bridge natural language, application schema, and APIs. TypeChat uses type definitions in your application to retrieve structured AI responses that are type-safe.

Introduced July 20 by a team featuring C# and TypeScript lead developer Anders Hejlsberg, a Microsoft technical fellow, TypeChat addresses the difficulty of building natural language interfaces, which traditionally have relied on complex decision trees to determine user intent and gather the inputs required to take action.

TypeChat replaces prompt engineering with schema engineering, TypeChat’s creators said. Developers can define types that represent the intents supported in a natural language application. This could be as simple as an interface to categorize sentiment or more complex, such as types for a shopping cart or music application.
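The sentiment example is the simple case the creators cite. A sketch of what such a schema might look like, paired with a plain runtime check standing in for TypeChat's validation step (the guard function here is illustrative, not part of TypeChat's API):

```typescript
// A sentiment schema of the kind a developer would hand to TypeChat.
interface SentimentResponse {
    sentiment: "negative" | "neutral" | "positive";
}

// Hypothetical runtime guard approximating what schema validation checks:
// the LLM's JSON reply must conform to the declared type.
function isSentimentResponse(value: unknown): value is SentimentResponse {
    return (
        typeof value === "object" &&
        value !== null &&
        "sentiment" in value &&
        ["negative", "neutral", "positive"].includes(
            (value as { sentiment: unknown }).sentiment as string
        )
    );
}

// A well-formed reply passes; a reply outside the declared union fails.
isSentimentResponse({ sentiment: "positive" }); // true
isSentimentResponse({ sentiment: "angry" });    // false
```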

After the developer defines the types, TypeChat constructs a prompt to the LLM using those types and validates that the LLM response conforms to the schema. If validation fails, further language model interaction is used to repair the non-conforming output. TypeChat also summarizes the instance and confirms that it aligns with user intent.

Developers can install TypeChat through NPM:

npm install typechat

TypeChat also can be built from source after cloning its GitHub repository:

git clone https://github.com/microsoft/TypeChat
cd TypeChat
npm install
npm run build

Elaborating on TypeChat, its creators said the recent “rush of excitement” around LLMs has raised many questions for developers. Chat assistants have been the most direct application, but questions remain about how to integrate these models into existing apps: how to augment traditional UIs with natural language interfaces, and how to use AI to convert a user request into a form apps can operate on. TypeChat is intended to answer these questions.

Copyright © 2023 IDG Communications, Inc.


