[ Addon Support ] AnkiTerminator : ChatGPT Sidebar for Anki Review, GoogleBard, BingChat (by Shige)

Hi, do you mean that Anki won’t start or that this add-on won’t start? The latest Anki launcher uses uv to run Anki in Python, so you might need to specify the Python version.


Hi, I am still on Win10, Qt6. Anki:

Version ⁨23.12.1 (1a1d4d54)⁩
Python 3.9.15 Qt 6.6.1 PyQt 6.6.1

btw, I only know my add-on is not the latest (it seems I installed it on 2025-Aug-3).

I didn’t find any version number anywhere; this config file is the only info I have about which version of your add-on is installed:

{
"ChatGPT_URL" : { "Chat_GPT": "https://chat.openai.com/chat/",
"Bing_Chat": "Microsoft Copilot: Your AI companion",
"Google_Bard": "https://gemini.google.com/app",
"DeepSeek": "https://chat.deepseek.com/"
},

"Custom_Ai" : false,
"Custom_AI_URL" : "",

"now_AI_type" : "Chat_GPT",

"button_name" : ["default","more","baby",
                "origin","joke","history",
                "synonym","mnemonic"],

"random_prompt": ["what is {}? Please tell me in detail."],

"exclusion_list" :["Image Occlusion","","","","","","",""],

"Priority_Fields_list" :["","","","","","","",""],
"Priority_tag_list" : ["","","","","","","",""],

"more_info" : ["Please explain more about {}."],
"baby_explanation" : ["What is {}? Explain like I'm 5 years old."],
"word_origin" : ["What is the origin of {}?"],
"make_joke" : ["Tell me funny jokes for {}."],
"history" : ["What is the history of {}?"],
"synonym" : ["What are the synonyms for {}?"],

"is_i_am_studying" : true,
"i_am_studying":["I am studying {}. "],

"Enter_Short_cut_Key" : "Ctrl+H",
"AI_shortcut_key" : "Ctrl+G",

"change_Language":true,
"language" : ["Please explain in {}."],
"mnemonic" : ["Please make {} mnemonic."],

"start_up" : true,
"add_gpt_to_the_top_toolbar" : true,

"submit_text" : false,

"hide_the_sidebar_on_the_answer_screen" : false,

"EffectVolume" : 0.5,

"last_tab": 0,

"is_change_log_2024_06_05": false

}


Looks like routing only anki.exe through the proxy is not enough; when I ran it in Sandboxie I found there is also QtWebEngineProcess.exe, and maybe others. I’ll troubleshoot it first. Thx.

ps: looks like I have to work out how to use Anki alone through the proxy first,

then see how to use it with this add-on. Thanks.

https://forums.ankiweb.net/t/anki-behind-proxy/8501/7


I didn’t add version info, so there is none. I think the “mod” number in meta.json is probably the download date as a Unix timestamp, which you can convert with any epoch/Unix timestamp conversion tool.
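If the “mod” value really is a Unix timestamp in seconds (which the forum post above only guesses at), converting it to a date takes just a few lines of standard-library Python. A small sketch; the sample timestamp is illustrative, not taken from any real meta.json:

```python
import json
from datetime import datetime, timezone

def mod_to_date(meta_json: str) -> str:
    """Read the 'mod' Unix timestamp (seconds) from a meta.json string
    and return it as an ISO date in UTC."""
    mod = json.loads(meta_json)["mod"]
    return datetime.fromtimestamp(mod, tz=timezone.utc).date().isoformat()

# Illustrative timestamp corresponding to early August 2025 (UTC)
print(mod_to_date('{"mod": 1754179200}'))  # → 2025-08-03
```

This matches the “installed on 2025-Aug-3” guess from the config discussion, assuming UTC; a local-time conversion could differ by a day near midnight.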

I don’t use a VPN much, so I’m not familiar with it, but in some countries AI is banned and, as you say, some users report they can’t use AI, so I think connecting via a VPN is a useful workaround.

Hi,

Today I revisited the add-on and found that not only is the LLM’s output helpful, it makes Anki more “alive,” which increased my interest in doing the cards. Good work.

However, if I use Proxifier to force anki.exe to go through a proxy (similar to a VPN), then when I edit cards it shows an error connecting to 127.0.0.1. I think forcing anki.exe (on Win10) through a proxy this way breaks Anki. I can now only use DeepSeek; although Grok doesn’t need a VPN, the right-click “send to AnkiTerminator” didn’t work for SuperGrok, at least not for me to date. It appears in the top bar but not in the prompt window.

To me, the solution would be for ONLY the LLM traffic to go through the proxy/VPN. VPN support would be hard to implement in this add-on, but traditional HTTP/HTTPS/SOCKS5 proxy support would be easier, as Python has built-in support for it.
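The split described above (LLM traffic proxied, local Anki traffic direct) can be sketched with only Python’s standard library. The proxy address and the host list are placeholder assumptions for illustration, not anything the add-on actually contains:

```python
import urllib.request

# Placeholder proxy address; a real user would put their own proxy here.
LLM_PROXY = {"http": "http://127.0.0.1:7890", "https": "http://127.0.0.1:7890"}

# Opener for external LLM endpoints: routes through the proxy.
llm_opener = urllib.request.build_opener(urllib.request.ProxyHandler(LLM_PROXY))

# Opener for local/other traffic: an empty mapping disables all proxies,
# including any picked up from environment variables.
direct_opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

def opener_for(url: str):
    """Pick the proxied opener only for known LLM hosts (illustrative list)."""
    llm_hosts = ("chat.openai.com", "gemini.google.com", "chat.deepseek.com")
    return llm_opener if any(h in url for h in llm_hosts) else direct_opener
```

With this pattern, `opener_for("http://127.0.0.1:8765")` returns the direct opener, so Anki’s localhost connections would be unaffected by the proxy setting.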

I would appreciate it if the author could consider this. Otherwise, with vibe coding, I could do it later when I have time. I only wish that if I make it work, it could be incorporated so I don’t have to modify it myself for every future version.

Please advise, thank you.


No need to feel sorry for me; I have subscriptions to ALL the services I want:

ChatGPT Edu, Claude Pro, SuperGrok, Gemini 3, DeepSeek. It’s just troublesome.

btw, I asked for Claude’s comment:

“Are my claims reasonable?”

Yes, your claims are technically reasonable and well-founded. Let me break down why:

✅ Valid Technical Points

  • Proxifier breaking localhost connections (127.0.0.1)
    Reasonable: Proxifier forces ALL connections from anki.exe through the proxy, including internal localhost calls that the add-on likely uses for LLM API communication. This is a known issue when using system-wide proxy enforcers with applications that have internal IPC (inter-process communication).

  • Solution: selective proxying for LLM traffic only
    Reasonable: This is the standard approach. You want local Anki operations → direct connection, and external LLM API calls → through proxy/VPN. This is exactly how most well-designed applications handle proxy requirements.

  • HTTP/HTTPS/SOCKS5 easier than a VPN implementation
    Reasonable: Python’s requests library and urllib have native proxy support:

        proxies = {
            'http': 'http://127.0.0.1:port',
            'https': 'http://127.0.0.1:port'
        }
        response = requests.get(url, proxies=proxies)

    A VPN would require OS-level routing changes, which is much more complex.

  • Grok/SuperGrok right-click not working
    Possibly reasonable: could be API endpoint differences, an authentication method incompatibility, or rate limiting / geographic restrictions. Hard to confirm without testing, but plausible.

🤔 Considerations

  • Your maintenance concern is valid: if the add-on updates frequently, maintaining your own fork is tedious.
  • Upstream contribution is the right approach: offering your code back to the author benefits everyone.

💡 Suggestion

When you implement this, consider making it configuration-based:

    # In add-on config
    "proxy_settings": {
        "enabled": false,
        "http": "http://127.0.0.1:7890",
        "https": "http://127.0.0.1:7890",
        "socks5": "socks5://127.0.0.1:1080"
    }

This way users can toggle it on/off without code changes.

Your technical reasoning is sound! 👍
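If a `proxy_settings` config block like the one Claude suggests were adopted, the add-on could turn it into a proxies mapping with a small helper. This is a sketch against that hypothetical layout; none of these key names are the add-on’s actual API:

```python
def proxies_from_config(config: dict):
    """Build a urllib/requests-style proxies dict from a hypothetical
    'proxy_settings' config block, or return None when disabled."""
    ps = config.get("proxy_settings", {})
    if not ps.get("enabled"):
        return None
    # Prefer SOCKS5 when configured; otherwise fall back to HTTP/HTTPS.
    if ps.get("socks5"):
        return {"http": ps["socks5"], "https": ps["socks5"]}
    return {"http": ps.get("http"), "https": ps.get("https")}
```

Returning `None` when the feature is disabled lets the rest of the add-on skip proxying entirely, so users outside restricted regions are unaffected, matching the opt-in behavior requested in the thread.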

Hi krstoevan! The VPN functionality seems difficult to develop, so I’m not sure it’s feasible. For now, to keep things stable, I plan to add features using each service’s official API.


Hi, as mentioned above, implementing VPN support is complex, but adding Python proxy support would be straightforward.

Would you consider adding it as an optional feature? I could try vibe coding it later if I have time.

  • Users outside restricted regions can simply leave it disabled in config

  • Won’t affect anyone who doesn’t need it

Even users with only VPN access could run an OpenWrt VM (Hyper-V/VMware) to convert the VPN into a proxy, so everyone benefits.

Thanks for considering!

btw, is the API part meant to avoid the (quite frequent) web-interface changes that make the add-on suddenly stop working?

Yes, that is the purpose of the API.

The problem is probably that I can’t debug it and I don’t use it. Since the connection already works on my device, I can’t test whether it connects or not, and even if I developed it I wouldn’t use it.

So far I have failed when developing such features. For example, I previously fixed an add-on for native Chinese speakers learning English, but since I’m not a Chinese speaker I don’t use the add-on myself, debugging was too difficult, and it broke again soon after. In the end another version, fixed by a Chinese developer, is working well.

So I think add-on features need to be features the developer actually uses; developing unused features is hard to reason about and lacks motivation, so it doesn’t work well.


Hi,

Proxifier worked! For those from the evil axis like me:

in Proxifier, set the “destination” to claude.ai and gemini.google.com; this forwards just the LLM traffic to the proxy while letting the local Anki part work directly.

However, my ChatGPT is a service from a US university, and forwarding chatgpt.com and openai.com was not enough to let me through. I’ll have to figure that out later.

@Shigeyuki sort of solved, thanks.

TLDR:

Below is how I wasted a night:

I tried Claude Code with Claude/DeepSeek, then Codex and Gemini.

I failed from 11 pm to 3 am; then, around 3 am, Gemini 3 Pro said yes, it’s hard to modify the code, because

you simply can’t split the local part onto a direct connection and the LLM part onto the proxy. Then Gemini 3 suggested a local proxy; I said the other LLMs claimed Proxifier wouldn’t work, and Gemini 3 said the other LLMs were wrong, it would work. And yes, it worked. Gemini 3 is the best LLM at this moment.

Claude is the most anti-evil-axis AI company; this is proof of work. Although ChatGPT isn’t working yet.


OK, everything worked. For ChatGPT, one needs to forward chatgpt.com, auth.openai.com, openai.com, and the US university’s domain if you are using their ChatGPT subscription.
I’m not sure why auth.openai.com needs its own rule when openai.com is already forwarded; anyway, it won’t hurt.

thanks


Hi. Great Anki add-on :). A couple of things. If I copy a chart from ChatGPT using the built-in copy and then right-click on the field where I want the chart/text to appear, the chart is not formatted right. If I copy the chart, edit the card, and paste it into the field manually, the chart is formatted right.

Another thing would be the ability to automatically copy ChatGPT’s output into a designated field, without me manually copying the text and clicking on the field I want it in. Thank you so much! I really appreciate this amazing add-on!


In such cases I instruct the AI to output the content in HTML format and then copy that HTML.

Or I take a partial screenshot, convert it to an image, and add it to the card. To take a partial screenshot:

  • Win: Windows + Shift + S
  • Mac: Command + Shift + 4

That feature is relatively difficult to develop for now (it would be fragile); it may become possible in the future.