News How to Run ChatGPT on Raspberry Pi or PC

This article is plain wrong. The ChatGPT API has not yet been released, and the model suggested in the article, "text-davinci-003", is a GPT-3 model, whereas ChatGPT is a separate GPT-3.5 model trained with InstructGPT techniques to follow instructions properly.
 
text-davinci-003 is a model used by ChatGPT, just like text-curie-001, text-babbage-001, and text-ada-001, but this has nothing to do with the official ChatGPT API: any other front end can submit queries to these models and receive a similar response, in the same sense that Stable Diffusion uses different "models".

You may want to research this topic a bit more before accusing an article of being incorrect. If anything, there's just a small fix needed.

The complete code listing contains an indentation error:
Python:
except KeyboardInterrupt:
print("\nExiting ChatGPT")

should be
Python:
except KeyboardInterrupt:
    print("\nExiting ChatGPT")
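To see the corrected `except` clause in context, here is a minimal, self-contained sketch of the loop structure such a script presumably uses (the `read_input` and `ask_model` names are hypothetical stand-ins, not taken from the article's listing):

```python
def chat_loop(read_input, ask_model):
    """Prompt the user repeatedly until Ctrl+C is pressed."""
    try:
        while True:
            prompt = read_input("You: ")
            print(ask_model(prompt))
    except KeyboardInterrupt:
        # The body must be indented under the except clause;
        # 4 spaces is the idiomatic Python indent.
        print("\nExiting ChatGPT")
```

Passing the input and model functions in as parameters keeps the sketch testable; a real script would use `input()` and a call to the OpenAI API there.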
 
After copying and pasting your code and adding my API key, I get the following error:
File "/home/tam/.local/lib/python3.9/site-packages/openai/api_requestor.py", line 680, in _interpret_response_line
raise self.handle_error_response(
openai.error.RateLimitError: You exceeded your current quota, please check your plan and billing details.
Does this actually work on the free version?
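For what it's worth, that error is about the account's quota, not the code: once the free-trial credit is exhausted or expired, every request fails with that message, and no code change will fix it. For transient per-minute rate limits, though, a retry-with-backoff wrapper is the usual pattern. Below is a hedged sketch; the `RateLimitError` class here is a stand-in for `openai.error.RateLimitError` so the example runs without the `openai` package:

```python
import time


class RateLimitError(Exception):
    """Stand-in for openai.error.RateLimitError, so this sketch is self-contained."""


def with_retries(call, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call `call()` and retry with exponential backoff on RateLimitError.

    Note: this only helps with transient per-minute limits. The quota error
    quoted above means the account has no credit left, and retrying will not
    change that.
    """
    for attempt in range(attempts):
        try:
            return call()
        except RateLimitError:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * 2 ** attempt)
```

You would wrap the actual API call in a lambda or function and pass it to `with_retries`.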
 
Run ChatGPT on Raspberry Pi or PC
This article has nothing to do with running any language model (GPT or otherwise) on a Raspberry Pi or PC. It merely shows how to interface with a language model run by a third party.
It would be like titling an article "How to Run Doom on a Raspberry Pi" and providing steps on how to display streamed video from GeForce Now.
 
Clickbait anyway. From the Wikipedia article: "...released ChatGPT, which was fine-tuned from a model in the GPT-3.5 series." In other words:

  1. ChatGPT <> GPT-3.5; it is a fine-tuned version instead. The author could have stated the truth, "How to Invoke the GPT-3.5 API from a Raspberry Pi", in the title, but that wouldn't have drawn as many clicks as the popular ChatGPT name, so he decided to inflate the claim instead.
  2. As already noted, the uninitiated would be very attracted to the possibility of running ChatGPT on a Raspberry Pi, although anybody with an idea of the size of this language model knows that would be impossible. Yet anything goes if you need to fill a quota, I guess.
 
What's a fair name for what the OP is doing without advertising it as ChatGPT? "ChatGPT API"? Something else?