I understand that issues are for feedback and problem solving, not for complaining in the comment section, and will provide as much information as possible to help solve the problem.
I've looked at pinned issues and searched existing Open Issues, Closed Issues, and Discussions; no similar issue or discussion was found.
I've filled in a short, clear heading so that developers can quickly get a rough idea of what to expect when flipping through the list of issues, rather than "a suggestion", "stuck", etc.
I've confirmed that I am using the latest version of Cherry Studio.
Platform
macOS
Version
v1.8.4
Bug Description
Using the local ollama nomic-embed-text model, Cherry Studio reports: Embedding failed: Cannot read properties of undefined (reading '0')
After encountering this problem, I tried the model via the local CLI, and it worked without issues.
Running the local ollama inference model deepseek-r1:8b in Cherry Studio works normally.
I also tried OpenRouter's embedding model, and it had no issues.
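For reference, the CLI check described above can also be done against ollama's HTTP API directly, which is the interface Cherry Studio would be calling (this sketch assumes ollama is serving on the default endpoint at localhost:11434):

```shell
# Query ollama's embeddings endpoint directly
# (assumes ollama is running locally on the default port 11434
# and nomic-embed-text has already been pulled)
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
# A working setup returns a JSON object containing an "embedding" array;
# an empty or error response here would point at ollama rather than Cherry Studio
```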
Steps To Reproduce
Steps have been included in the details above
Expected Behavior
I expect to be able to use the ollama embedding model normally. A web search found no answer to this problem.
Relevant Log Output
Additional Context
No response
Note
This issue was translated by Claude.