andrewji8

An open-source alternative to OpenAI: self-deploy large models with no GPU required.

Free and open-source alternative to OpenAI. Self-hosted, community-driven, local-first. A direct replacement for OpenAI running on consumer-grade hardware. No GPU required. Run ggml, gguf, GPTQ, onnx, TF compatible models: llama, llama2, rwkv, whisper, vicuna, koala, cerebras, falcon, dolly, starcoder, etc.

Official website: https://localai.io/
GitHub address: https://github.com/mudler/LocalAI

Mirror in China (translated version): http://www.gitpp.com/llm/localai-cn

LocalAI is a free and open-source alternative to OpenAI. It acts as a drop-in replacement REST API that is compatible with the OpenAI API specification for local inferencing. It lets you run LLMs and generate images, audio, and more, locally or on-premises on consumer-grade hardware, and it supports multiple model families. No GPU is required.

LocalAI is an open-source project that aims to give developers a free, easy-to-use local artificial-intelligence solution. Because it is compatible with OpenAI's API specification, developers can use similar functionality without relying on OpenAI's services. LocalAI supports a range of model families, including language models, image-generation models, audio-generation models, and more.

Key features of LocalAI include:

  1. Open-source: LocalAI is open-source, meaning its source code can be freely viewed, modified, and distributed by anyone.

  2. Free to use: LocalAI is free, so developers can use its functionality at no cost.

  3. Local inference: LocalAI supports inference on local hardware, eliminating the need to connect to cloud services or use remote servers.

  4. Consumer-grade hardware: LocalAI can run on consumer-grade hardware without requiring high-performance GPUs or special hardware support.

  5. Model compatibility: LocalAI supports multiple model series, allowing developers to choose the appropriate model based on their needs.

  6. API compatibility: LocalAI's interface design is compatible with OpenAI's API specification, making it relatively easy for developers to migrate existing OpenAI code to LocalAI.

LocalAI offers an alternative for developers who want to avoid centralized services, or who have privacy and security concerns and prefer to keep their data in a local environment. Note, however, that while LocalAI is convenient, it may not fully match the services OpenAI provides, especially in model performance and functionality. Developers should evaluate LocalAI against their own needs and expectations.
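Because the API surface mirrors OpenAI's, calling a LocalAI instance usually only requires changing the base URL in existing client code. Below is a minimal sketch using only the Python standard library; the host, port, and model name are assumptions for a typical local deployment, not part of the project's documentation:

```python
import json
import urllib.request

# Assumed address of a locally running LocalAI instance (default port 8080).
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a request body following the OpenAI chat-completions schema."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def chat(model: str, prompt: str) -> str:
    """POST the request to a running LocalAI instance and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        LOCALAI_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Same response shape as OpenAI's chat completions endpoint.
    return data["choices"][0]["message"]["content"]
```

With a server running and a model configured, `chat("gpt-3.5-turbo", "Hello!")` would return the model's reply; the model name must match one defined in your LocalAI configuration.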

Getting Started:
The easiest way to run LocalAI is to use Docker Compose or Docker (to build locally, see the Build section).
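As a sketch, a minimal docker-compose.yaml for such a setup might look like the following; the image tag, port, and volume path are illustrative assumptions, so check the official documentation for current values:

```yaml
services:
  api:
    image: localai/localai:latest-aio-cpu   # CPU-only all-in-one image (assumed tag)
    ports:
      - "8080:8080"                         # OpenAI-compatible API endpoint
    volumes:
      - ./models:/models                    # model files and YAML configs
```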

LocalAI requires at least one model file, a configuration YAML file, or both. The configuration file lets you customize model defaults and model-specific settings.
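For illustration, a per-model YAML configuration might look like the following; the model name, backend, and file name here are placeholders rather than a definitive schema:

```yaml
name: gpt-3.5-turbo              # name clients will request through the API
backend: llama-cpp               # inference backend (assumed)
parameters:
  model: my-model.Q4_K_M.gguf    # model file placed in the models directory
  temperature: 0.7
```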

Container image requirements:

  • Docker or Podman, or a container engine

You can also build and run the container image locally with Docker, for example:

# build the image
docker build -t localai .
# run it, publishing the API port (8080 by default)
docker run -p 8080:8080 localai

Local:
To build LocalAI locally, you need to meet the following requirements:

  • Golang >= 1.21
  • Cmake/make
  • GCC
  • GRPC

Build LocalAI with the following commands:

git clone https://github.com/go-skynet/LocalAI
cd LocalAI
make build
# run the resulting binary (serves the API, by default on port 8080)
./local-ai