Guanaco (33B)

Open-source 33B-parameter chatbot, finetuned from LLaMA using 4-bit QLoRA.

Basic Information

  • Model Name: Guanaco
  • Developer/Creator: Tim Dettmers
  • Release Date: May 2023
  • Version: 33B
  • Model Type: Text-based LLM

Description

Overview

The Guanaco-33B is an open-source, high-quality chatbot model developed by finetuning the 33B-parameter LLaMA model on the OASST1 dataset using 4-bit QLoRA. It is competitive with commercial chatbots such as ChatGPT on standard chatbot benchmarks.

Key Features

  • Trained on the multilingual OASST1 dataset, with best performance in high-resource languages
  • Uses LoRA adapters with $r=64$ added to all linear layers of the base LLaMA model
  • Finetuned using 4-bit QLoRA: the frozen base model is quantized to the 4-bit NormalFloat (NF4) datatype, while the LoRA adapters are trained on top in higher precision
  • Lightweight adapter-only checkpoints, allowing cheap local experimentation (see the loading sketch below)
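
As a concrete illustration, the snippet below sketches how such an adapter checkpoint can be loaded on top of a 4-bit NF4-quantized base model with Hugging Face transformers, peft, and bitsandbytes. The repository IDs and the prompt format are assumptions for illustration, not values confirmed by this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# Quantize the frozen base model to 4-bit NormalFloat (NF4), as in QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Assumed repository IDs; substitute the ones you actually use.
BASE_ID = "huggyllama/llama-30b"        # 33B-class LLaMA base weights
ADAPTER_ID = "timdettmers/guanaco-33b"  # lightweight Guanaco LoRA adapters

base = AutoModelForCausalLM.from_pretrained(
    BASE_ID, quantization_config=bnb_config, device_map="auto"
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)  # attach the adapters
tokenizer = AutoTokenizer.from_pretrained(BASE_ID)

# Guanaco-style prompt format (assumed): "### Human: ... ### Assistant:"
prompt = "### Human: Explain 4-bit QLoRA in one paragraph.\n### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```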

Intended Use

The Guanaco-33B model is intended for research purposes and may produce problematic outputs. The adapter weights are available under the Apache 2.0 license, but using the model requires access to the underlying LLaMA weights, which carry additional licensing requirements.

Language Support

The Guanaco-33B model supports multiple languages, with best performance in high-resource languages due to the composition of the OASST1 dataset used for finetuning.

Technical Details

Architecture

The Guanaco-33B model is based on the LLaMA architecture, a decoder-only Transformer language model. LoRA adapters with $r=64$ are added to all linear layers of the base LLaMA model, as sketched below.
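
A minimal peft configuration matching that description is sketched below. Only $r=64$ comes from this card; the target-module names, the remaining hyperparameters, and the helper function name are assumptions based on common QLoRA recipes for LLaMA.

```python
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Rank-64 LoRA adapters on every linear projection of each LLaMA block.
# Values other than r=64 are assumed, not confirmed by this card.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",  # attention projections
        "gate_proj", "up_proj", "down_proj",     # MLP projections
    ],
)

def add_guanaco_style_adapters(base_model):
    """Wrap a 4-bit quantized LLaMA base model (loaded as in the sketch above)
    with trainable LoRA adapters, QLoRA-style."""
    base_model = prepare_model_for_kbit_training(base_model)
    peft_model = get_peft_model(base_model, lora_config)
    peft_model.print_trainable_parameters()  # adapters are a tiny fraction of 33B
    return peft_model
```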

Training Data

The model is finetuned on OASST1 (OpenAssistant Conversations), a crowd-sourced multilingual dataset of assistant-style dialogues. The size and diversity of the dataset allow the model to engage in open-ended conversations on a wide range of topics.

Data Source and Size

The OASST1 dataset is openly released: it contains roughly 161,000 crowd-sourced messages organized into about 66,000 conversation trees across 35 languages, and Guanaco is finetuned on a curated subset of these conversations.
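
Because the dataset is public, it can be inspected directly with the Hugging Face datasets library; a brief sketch follows, assuming the official OASST1 repository ID.

```python
from datasets import load_dataset

# Load the public OASST1 release (official Hugging Face dataset ID assumed).
oasst1 = load_dataset("OpenAssistant/oasst1")

print(oasst1)                            # train / validation splits of message rows
example = oasst1["train"][0]
print(example["lang"], example["role"])  # per-message language tag and role
print(example["text"][:200])             # the message text itself
```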

Knowledge Cutoff

An explicit knowledge cutoff date is not published for the Guanaco-33B model; in practice its knowledge is bounded by the pretraining data of the base LLaMA model (released in February 2023) and the OASST1 conversations collected up to its April 2023 release. As an open-source project, updated finetunes may appear over time.

Diversity and Bias

The OASST1 dataset is multilingual and crowd-sourced, which broadens the range of inputs the model can handle, but contributions skew heavily toward high-resource languages and reflect the demographics and preferences of its volunteer contributors, so the usual cautions about dataset-driven bias apply.

Performance Metrics

The Guanaco-33B model has been evaluated on chatbot benchmarks such as the Vicuna benchmark, judged by both GPT-4 and human raters, where it performs competitively with commercial chatbots like ChatGPT and Bard. However, its performance may vary across languages and on tasks not covered by these evaluations.

Usage

API Usage Example
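
The exact endpoint and model identifier depend on the provider hosting the model; the sketch below assumes an OpenAI-compatible chat-completions API, with placeholder values for the base URL, API key, and model name.

```python
from openai import OpenAI

# Placeholder values: substitute your provider's base URL, key, and model ID.
client = OpenAI(
    base_url="https://api.example.com/v1",  # hypothetical OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="guanaco-33b",  # provider-specific model identifier (assumed)
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what 4-bit QLoRA finetuning is."},
    ],
    max_tokens=256,
    temperature=0.7,
)

print(response.choices[0].message.content)
```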

Ethical Guidelines

The Guanaco authors present the model as a research artifact: it has not undergone safety-focused alignment training and can produce biased or factually incorrect outputs, so downstream users are expected to add their own safeguards around transparency, accountability, and potential misuse.

License Type

The Guanaco-33B adapter weights are available under the Apache 2.0 license, which allows for commercial and non-commercial use; the underlying LLaMA base weights, however, are distributed under a separate, more restrictive research license.
