Merge pull request #815 from irexyc/typo

Jacky 3 months ago
commit 8cf7884f74

+ 2 - 2
app/src/language/ar/app.po

@@ -2505,11 +2505,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو imdeploy. فهي توفر نقطة "
+"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو lmdeploy. فهي توفر نقطة "
 "نهاية API متوافقة مع OpenAI، لذا قم فقط بتعيين baseUrl إلىAPI المحلية الخاصة "
 "بك."
 

+ 2 - 2
app/src/language/de_DE/app.po

@@ -2657,12 +2657,12 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
 "Um ein lokales großes Modell zu verwenden, implementiere es mit ollama, vllm "
-"oder imdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
+"oder lmdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
 "die baseUrl auf deine lokale API."
 
 #: src/views/preference/OpenAISettings.vue:72

+ 1 - 1
app/src/language/en/app.po

@@ -2598,7 +2598,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 2 - 2
app/src/language/es/app.po

@@ -2579,11 +2579,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Para utilizar un modelo local grande, impleméntelo con vllm o imdeploy. "
+"Para utilizar un modelo local grande, impleméntelo con vllm o lmdeploy. "
 "Estos proporcionan un API endpoint compatible con OpenAI, por lo que solo "
 "debe configurar la baseUrl en su API local."
 

+ 1 - 1
app/src/language/fr_FR/app.po

@@ -2621,7 +2621,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 1 - 1
app/src/language/ko_KR/app.po

@@ -2585,7 +2585,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 1 - 1
app/src/language/messages.pot

@@ -2400,7 +2400,7 @@ msgid "To make sure the certification auto-renewal can work normally, we need to
 msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
-msgid "To use a local large model, deploy it with ollama, vllm or imdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
+msgid "To use a local large model, deploy it with ollama, vllm or lmdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
 msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:72

+ 1 - 1
app/src/language/ru_RU/app.po

@@ -2565,7 +2565,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 2 - 2
app/src/language/tr_TR/app.po

@@ -2779,11 +2779,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Yerel bir büyük model kullanmak için, vllm veya imdeploy ile dağıtın. OpenAI "
+"Yerel bir büyük model kullanmak için, vllm veya lmdeploy ile dağıtın. OpenAI "
 "uyumlu bir API uç noktası sağlarlar, bu nedenle baseUrl'yi yerel API'nize "
 "ayarlamanız yeterlidir."
 

+ 1 - 1
app/src/language/vi_VN/app.po

@@ -2619,7 +2619,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 2 - 2
app/src/language/zh_CN/app.po

@@ -2453,11 +2453,11 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"要使用本地大型模型,可使用 ollama、vllm 或 imdeploy 进行部署。它们提供了与 "
+"要使用本地大型模型,可使用 ollama、vllm 或 lmdeploy 进行部署。它们提供了与 "
 "OpenAI 兼容的 API 端点,因此只需将 baseUrl 设置为本地 API 即可。"
 
 #: src/views/preference/OpenAISettings.vue:72

+ 1 - 1
app/src/language/zh_TW/app.po

@@ -2504,7 +2504,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

+ 1 - 1
app/src/views/preference/OpenAISettings.vue

@@ -45,7 +45,7 @@ const models = shallowRef([
      :validate-status="errors?.openai?.base_url ? 'error' : ''"
      :help="errors?.openai?.base_url === 'url'
        ? $gettext('The url is invalid.')
-        : $gettext('To use a local large model, deploy it with ollama, vllm or imdeploy. '
+        : $gettext('To use a local large model, deploy it with ollama, vllm or lmdeploy. '
          + 'They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API.')"
    >
      <AInput
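
For context, the string being corrected tells users to point the baseUrl at a local OpenAI-compatible endpoint served by ollama, vllm, or lmdeploy. A minimal sketch of what that looks like from a client's side, using only the Python standard library; the port (11434, ollama's default) and the model name are assumptions for illustration, not values from this PR:

```python
# Build the request an OpenAI-style client would send to a local
# OpenAI-compatible chat endpoint. Nothing is sent over the network here.
import json
import urllib.request

base_url = "http://localhost:11434/v1"  # hypothetical local endpoint (ollama default port)

payload = {
    "model": "llama3",  # hypothetical model name
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

print(req.full_url)  # http://localhost:11434/v1/chat/completions
# urllib.request.urlopen(req) would perform the call once a local server is running.
```

Because all three servers expose the same `/v1` path layout, only `base_url` changes between them; that is why the UI help text says setting baseUrl is all that is needed.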