python-tgpt
>>> from pytgpt.leo import LEO
>>> bot = LEO()
>>> bot.chat('Hello there')
" Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
>>>
This project enables seamless interaction with over 45 free LLMs without requiring an API Key.
The name python-tgpt draws inspiration from its parent project tgpt, which is written in Go. Through this Python adaptation, users can effortlessly engage with a number of freely available LLMs, fostering a smoother AI interaction experience.
Features
- 🗨️ Enhanced conversational chat experience
- 💾 Capability to save prompts and responses (Conversation)
- 🔄 Ability to load previous conversations
- ⌨️ Command-line interface
- 🐍 Python package
- 🌊 Stream and non-stream response
- 🚀 Ready to use (No API key required)
- ⛓️ Chained requests via proxy
- 🤖 Pass awesome-chatgpt prompts easily
- 🧠 Multiple LLM providers – 45+
Providers
These are simply the hosts of the LLMs, which include:
- Leo – Brave
- FakeOpen
- Koboldai
- OpenGPTs
- OpenAI (API key required)
- WebChatGPT – OpenAI (Session ID required)
- Bard – Google (Session ID required)
41+ other models proudly offered by gpt4free.
All models (including some that are not currently working):
1 AItianhu
2 AItianhuSpace
3 Acytoo
4 AiAsk
5 AiChatOnline
6 AiChatting
7 AiService
8 Aibn
9 Aichat
10 Ails
11 Aivvm
12 AsyncGeneratorProvider
13 AsyncProvider
14 Aura
15 Bard
16 BaseProvider
17 Berlin
18 Bestim
19 Bing
20 ChatAiGpt
21 ChatAnywhere
22 ChatBase
23 ChatForAi
24 Chatgpt4Online
25 ChatgptAi
26 ChatgptDemo
27 ChatgptDemoAi
28 ChatgptDuo
29 ChatgptFree
30 ChatgptLogin
31 ChatgptNext
32 ChatgptX
33 Chatxyz
34 CodeLinkAva
35 CreateImagesProvider
36 Cromicle
37 DeepInfra
38 DfeHub
39 EasyChat
40 Equing
41 FakeGpt
42 FastGpt
43 Forefront
44 FreeChatgpt
45 FreeGpt
46 GPTalk
47 GeekGpt
48 GeminiProChat
49 GetGpt
50 Gpt6
51 GptChatly
52 GptForLove
53 GptGo
54 GptGod
55 GptTalkRu
56 H2o
57 Hashnode
58 HuggingChat
59 Koala
60 Komo
61 Liaobots
62 Llama2
63 Lockchat
64 MikuChat
65 MyShell
66 Myshell
67 OnlineGpt
68 Opchatgpts
69 OpenAssistant
70 OpenaiChat
71 PerplexityAi
72 Phind
73 Pi
74 Poe
75 Raycast
76 RetryProvider
77 TalkAi
78 Theb
79 ThebApi
80 V50
81 Vercel
82 Vitalentum
83 Wewordle
84 Wuguokai
85 Ylokh
86 You
87 Yqcloud
Prerequisites
Installation and Usage
Installation
Download binaries for your system from here.
Alternatively, you can install the Python package directly. (Recommended)
Choose one of the following methods to get started.
- From PyPI:
  pip install --upgrade python-tgpt
- Directly from the source:
  pip install git+https://github.com/Simatwa/python-tgpt.git
- Clone and install:
  git clone https://github.com/Simatwa/python-tgpt.git
  cd python-tgpt
  pip install .
Usage
This package offers a convenient command-line interface.
Note: Aura is the default provider. Use the --provider flag followed by the provider name of your choice, e.g. --provider koboldai. You can also simply use pytgpt instead of python -m pytgpt.
Starting from version 0.2.7, running $ pytgpt without any other command or option will automatically enter interactive mode. Otherwise, you need to explicitly declare the desired action, for example by running $ pytgpt generate.
Developer Docs
- Generate a quick response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How may I help you.
- Get back whole response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""
Stream Response
Just add the parameter stream with the value True.
- Text Generated only
from pytgpt.leo import LEO
bot = LEO()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
print(value)
# output
"""
How may
How may I help
How may I help you
How may I help you today?
"""
- Whole Response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
Note: All providers share the same class methods.
Openai
import pytgpt.openai as openai
bot = openai.OPENAI("<OPENAI-API-KEY>")
print(bot.chat("<Your-prompt>"))
Koboldai
import pytgpt.koboldai as koboldai
bot = koboldai.KOBOLDAI()
print(bot.chat("<Your-prompt>"))
Fakeopen
import pytgpt.fakeopen as fakeopen
bot = fakeopen.FAKEOPEN()
print(bot.chat("<Your-prompt>"))
Opengpt
import pytgpt.opengpt as opengpt
bot = opengpt.OPENGPT()
print(bot.chat("<Your-prompt>"))
Bard
import pytgpt.bard as bard
bot = bard.BARD('<Path-to-bard.google.com.cookies.json>')
print(bot.chat("<Your-prompt>"))
Gpt4free providers
import pytgpt.gpt4free as gpt4free
bot = gpt4free.GPT4FREE(provider="Aura")
print(bot.chat("<Your-prompt>"))
To obtain more tailored responses, consider utilizing optimizers via the optimizer parameter. Its value can be set to either code or system_command.
from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)
Note: Starting from v0.1.0, the default mode of interaction is conversational. This mode enhances the interactive experience by offering better control over the chat history: by associating previous prompts and responses, it tailors conversations for a more engaging experience.
You can still disable the mode:
bot = koboldai.KOBOLDAI(is_conversation=False)
Utilize the --disable-conversation
flag in the console to achieve the same functionality.
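Conceptually, conversational mode amounts to a history buffer whose prior turns are prepended to each new prompt. The sketch below is purely illustrative of that idea and is not pytgpt's actual implementation; the `Conversation` class, its method names, and the prompt format here are all hypothetical:

```python
# Illustrative sketch of a conversational history buffer.
# NOT pytgpt's actual implementation; all names are hypothetical.

class Conversation:
    def __init__(self, is_conversation=True, max_turns=10):
        self.is_conversation = is_conversation
        self.max_turns = max_turns
        self.history = []  # list of (prompt, response) pairs

    def build_prompt(self, prompt):
        """Prepend prior turns so the model sees the whole exchange."""
        if not self.is_conversation or not self.history:
            return prompt
        context = "\n".join(
            f"User: {p}\nAssistant: {r}" for p, r in self.history
        )
        return f"{context}\nUser: {prompt}\nAssistant:"

    def record(self, prompt, response):
        """Save the turn, keeping only the most recent max_turns."""
        self.history.append((prompt, response))
        self.history = self.history[-self.max_turns:]
```

A provider would then call something like `full = conv.build_prompt(user_prompt)` before sending the request, and `conv.record(user_prompt, response)` afterwards; disabling conversation (`is_conversation=False`) simply makes `build_prompt` return the prompt unchanged.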
Warning: Bard and WebChatGPT handle context automatically, so the is_conversation parameter is unnecessary and not required when initializing the respective classes. Also be aware that the majority of providers offered by gpt4free require Google Chrome in order to function.
Advanced Usage of Placeholders
The generate
functionality has been enhanced starting from v0.3.0 to enable comprehensive utilization of the --with-copied
option and support for accepting piped inputs. This improvement introduces placeholders, offering dynamic values for more versatile interactions.
| Placeholder | Represents |
|---|---|
| {{stream}} | The piped input |
| {{copied}} | The last copied text |
This feature is particularly beneficial for intricate operations. For example:
$ git diff | pytgpt generate "Here is a diff file: {{stream}} Make a concise commit message from it, aligning with my commit message history: {{copied}}" -p fakeopen --with-copied --shell --new
In this illustration, {{stream}} denotes the result of the $ git diff operation, while {{copied}} signifies the content copied from the output of the $ git log command.
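Under the hood, these placeholders boil down to string substitution performed before the prompt is sent: {{stream}} is filled from the piped standard input and {{copied}} from the system clipboard. A minimal sketch of the idea, not pytgpt's actual code (the `fill_placeholders` function and its parameters are hypothetical):

```python
# Illustrative sketch of placeholder substitution; not pytgpt's actual code.

def fill_placeholders(prompt, stream_input=None, copied_text=None):
    """Replace {{stream}} and {{copied}} with their dynamic values.

    A placeholder is left untouched when its value is unavailable,
    e.g. when nothing was piped in or --with-copied was not passed.
    """
    if stream_input is not None:
        prompt = prompt.replace("{{stream}}", stream_input)
    if copied_text is not None:
        prompt = prompt.replace("{{copied}}", copied_text)
    return prompt

# In a CLI, stream_input would typically come from sys.stdin (the piped
# data) and copied_text from the clipboard when --with-copied is passed.
```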
For more usage info run $ pytgpt --help
$ pytgpt --help
Usage: pytgpt [OPTIONS] COMMAND [ARGS]...
Options:
-v, --version Show the version and exit.
-h, --help Show this message and exit.
Commands:
awesome Perform CRUD operations on awesome-prompts
generate Generate a quick response with AI
gpt4free Discover gpt4free models, providers etc
interactive Chat with AI interactively (Default)
utils Utility endpoint for pytgpt
webchatgpt Reverse Engineered ChatGPT Web-Version
## [CHANGELOG](https://github.com/Simatwa/python-tgpt/blob/main/docs/CHANGELOG.md)