# Introduction

The [OffloadGPT API](https://rapidapi.com/microdeploy/api/offloadgpt) is a server-side service that manages OpenAI ChatGPT API requests on your behalf.

## How it works

OffloadGPT is an asynchronous API that stores ChatGPT responses on generated permalinks **without waiting for the OpenAI API response**.

It saves the chat status of every request so the conversation can easily be retrieved or continued.

The API offers support for the following features:

* Delegates API requests, relieving your server of long-running scripts.
* Instantly generates custom endpoint permalinks for each ChatGPT request.
* Parallel execution of multiple requests without increasing your server load.
* Full compatibility with the official [OpenAI Chat Completion](https://platform.openai.com/docs/api-reference/chat/create) parameters.
* Private and Public access to share conversations with others or ensure chat privacy.
* Real-time storing of Streaming and Asynchronous API responses.
* Notifies request finalization to external webhook URLs with the full processed data.
* Concatenates messages from previous responses using the `from_status_url` param.
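As a rough sketch of how such a request might be assembled: the `model` and `messages` fields follow the official OpenAI Chat Completion format the API is compatible with, and `from_status_url` is the continuation parameter named above. The permalink value and the overall payload shape are illustrative assumptions, not a definitive request spec.

```python
import json

# Hypothetical OffloadGPT request payload (illustration only).
payload = {
    # Standard OpenAI Chat Completion parameters, passed through as-is:
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
    # Continue a previous conversation from its status permalink
    # (example URL is a placeholder):
    "from_status_url": "https://example.com/status/abc123",
}

body = json.dumps(payload)
print(body)
```

Consult the API reference below for the actual endpoint paths and the full parameter list.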

## Want to jump right in?

Feeling like an eager beaver? Jump into the quick start docs and make your first request:

{% content-ref url="quick-start" %}
[quick-start](https://offloadgpt-docs.microdeploy.com/quick-start)
{% endcontent-ref %}

## Want to deep dive?

Dive a little deeper and start exploring our API reference to get an idea of everything that's possible with the API:

{% content-ref url="reference/api-reference" %}
[api-reference](https://offloadgpt-docs.microdeploy.com/reference/api-reference)
{% endcontent-ref %}


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://offloadgpt-docs.microdeploy.com/introduction.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
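Because the question travels in a query string, it must be URL-encoded. A minimal sketch of building such a query URL (the sample question is illustrative):

```python
from urllib.parse import urlencode

# Build a documentation query URL using the ?ask= mechanism described above.
# urlencode handles percent-escaping of spaces and punctuation.
base = "https://offloadgpt-docs.microdeploy.com/introduction.md"
question = "How do I make a conversation private?"
url = f"{base}?{urlencode({'ask': question})}"
print(url)
# → https://offloadgpt-docs.microdeploy.com/introduction.md?ask=How+do+I+make+a+conversation+private%3F
```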
