
Pantheons: A one-stop TypeScript library for conversing with mainstream large language models


Preface

In an era of rapid AI development, large language models (LLMs) have become an important force driving technological progress. Whether in natural language processing, text generation, intelligent question answering, or code assistance, the application scenarios of LLMs keep expanding, and new models appear one after another. However, faced with a wide variety of models, each with its own interface standard, developers often run into complexity and compatibility issues when integrating and managing them.

Pantheons was born to solve this pain point. It is a unified conversation library built with TypeScript on top of the OpenAI SDK, and it aims to give developers a simple, efficient interface for interacting with multiple large language models (LLMs). With Pantheons, developers can easily integrate mainstream models such as OpenAI, DeepSeek, DashScope, and Gemini without worrying about the underlying differences, and focus on implementing their own business logic.

Features

  • Unified interface design: all models are built on the OpenAI SDK and share the same calling convention, which greatly lowers the learning curve (see the sketch after this list)
  • Type safety: written in TypeScript with complete type definitions for a smoother development experience
  • Multiple models supported: more than a dozen mainstream large language models are currently supported, including OpenAI, Azure OpenAI, Tongyi Qianwen, Wenxin Yiyan, Tencent Hunyuan, Google Gemini, and more, covering almost all mainstream cloud and local LLM services
  • Multiple runtimes: runs in Node.js, Bun, Web, and other environments
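
To make the shared calling convention concrete, here is a minimal sketch. It is not taken from the official documentation: it assumes the OpenAI-style client.chat.completions.create call used in the examples later in this article, and the ask helper is hypothetical. The point is that switching providers only means constructing a different client; the request shape stays the same.

import { DeepSeek, Moonshot } from 'pantheons';

// Hypothetical helper (not part of the library): the same streaming request
// works against any Pantheons client, because each one exposes the
// OpenAI-style chat.completions.create method.
async function ask(client: DeepSeek | Moonshot, model: string, prompt: string): Promise<string> {
  const stream = await client.chat.completions.create({
    model,
    stream: true,
    messages: [{ role: 'user', content: prompt }],
  });

  let result = '';
  for await (const chunk of stream) {
    result += chunk.choices[0].delta?.content ?? '';
  }
  return result;
}

(async () => {
  // Only the constructor and the model name change between providers.
  console.log(await ask(new DeepSeek('Your key'), 'deepseek-chat', 'Hi!'));
  console.log(await ask(new Moonshot('Your key'), 'kimi-latest', 'Hi!'));
})();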

Supported large models

  • OpenAI
  • Azure OpenAI
  • DashScope
  • Tencent HunYuan
  • Moonshot
  • SiliconFlow
  • DeepSeek
  • Wenxin Yiyan (Qianfan)
  • Gemini
  • Ollama
  • Zhipu Qingyan (ZhiPu)
  • XAI
  • 01.AI (LingYiWanWu)
  • MiniMax
  • iFLYTEK Spark
  • Anthropic (Claude)

How to use

Node.js


import { DeepSeek } from 'pantheons';

(async () => {
  const client = new DeepSeek('Your key');
  const stream = await client.chat.completions.create({
    model: 'deepseek-chat',
    stream: true,
    messages: [{ role: 'user', content: 'Hi!' }],
  });

  let result = '';
  for await (const chunk of stream) {
    result += chunk.choices[0].delta?.content ?? '';
  }

  console.log(result);
})();
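
Hard-coding the API key is only for demonstration. In a real Node.js service you would usually read it from the environment and guard the stream against request failures. The sketch below assumes the same client API as the example above; the DEEPSEEK_API_KEY variable name is just an example.

import { DeepSeek } from 'pantheons';

(async () => {
  // Read the key from the environment instead of hard-coding it.
  const apiKey = process.env.DEEPSEEK_API_KEY;
  if (!apiKey) {
    throw new Error('DEEPSEEK_API_KEY is not set');
  }

  const client = new DeepSeek(apiKey);

  try {
    const stream = await client.chat.completions.create({
      model: 'deepseek-chat',
      stream: true,
      messages: [{ role: 'user', content: 'Hi!' }],
    });

    let result = '';
    for await (const chunk of stream) {
      result += chunk.choices[0].delta?.content ?? '';
    }
    console.log(result);
  } catch (err) {
    // Network or provider-side errors surface here.
    console.error('Chat request failed:', err);
  }
})();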

Bun


import { DeepSeek } from '@greywen/pantheons';

const client = new DeepSeek('Your key');
const stream = await client.chat.completions.create({
  model: 'deepseek-chat',
  stream: true,
  messages: [{ role: 'user', content: 'Hi!' }],
});

let result = '';
for await (const chunk of stream) {
  result += chunk.choices[0].delta?.content ?? '';
}

console.log(result);

Multi-model


import { DashScope, Moonshot, DeepSeek } from 'pantheons';

const deepSeekClient = new DeepSeek('Your key');
const dashScopeClient = new DashScope('Your key');
const moonshotClient = new Moonshot('Your key');

const messages = [{ role: 'user', content: 'Hi!' }];

const deepSeekStream = await deepSeekClient.chat.completions.create({
  model: 'deepseek-chat',
  stream: true,
  messages,
});

const dashScopeStream = await dashScopeClient.chat.completions.create({
  model: 'qwen-max',
  stream: true,
  messages,
});

const moonshotStream = await moonshotClient.chat.completions.create({
  model: 'kimi-latest',
  stream: true,
  messages,
});

async function readStream(stream: AsyncIterable<any>, output: string[]) {
  for await (const chunk of stream) {
    const content = chunk.choices[0].delta?.content || '';
    output.push(content);
  }
}

const deepSeekOutput: string[] = [];
const dashScopeOutput: string[] = [];
const moonshotOutput: string[] = [];

await Promise.all([
  readStream(deepSeekStream, deepSeekOutput),
  readStream(dashScopeStream, dashScopeOutput),
  readStream(moonshotStream, moonshotOutput),
]);

console.log(deepSeekOutput, dashScopeOutput, moonshotOutput);
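
Because the three streams are consumed with Promise.all, they are read concurrently, but a single failing provider rejects the whole batch. If you want the other replies to survive a failure, Promise.allSettled (standard JavaScript, not specific to Pantheons) is a drop-in variation for the last part of the example above, reusing the same readStream helper and stream variables:

// Variation: collect results per provider even if one stream fails.
const results = await Promise.allSettled([
  readStream(deepSeekStream, deepSeekOutput),
  readStream(dashScopeStream, dashScopeOutput),
  readStream(moonshotStream, moonshotOutput),
]);

results.forEach((settled, index) => {
  if (settled.status === 'rejected') {
    console.error(`Provider ${index} failed:`, settled.reason);
  }
});

console.log(deepSeekOutput, dashScopeOutput, moonshotOutput);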

Summary

Pantheons is a multi-model integration tool. With its unified, efficient interface design, developers can significantly reduce the cost and time of integrating multiple language models. Whether you want to quickly connect to a single model or need to switch freely between several, Pantheons can be an indispensable tool for you.

In the future, Pantheons will continue to add support for more models while improving performance and ease of use, giving developers an even stronger toolchain. If you are looking for a solution to the pain points of multi-model integration, give Pantheons a try.

Developers are welcome to star the project, submit issues, contribute code, and join the discussion. Thank you!

At the same time, you are also welcome to try our published large-model project, Sdcb Chats. If you find it helpful, please give us a Star on GitHub! Your support is our driving force.

Thank you again for your support, and we look forward to bringing you more surprises in the future!