Enable local AI models in chat button isn't working
-
fialamiri last edited by
Hi. I have a basic problem. I installed the Opera developer version and tried to download some local AI models, but the button for enabling that feature doesn't work. When I click on it, nothing happens. The same thing occurs on both Linux and Windows. Can someone please help me?
-
CalDubs last edited by
@fialamiri Same here on a Mac.
In my case, I was able to reach the initial Aria settings menu once and select local AI models to download. But after I switched to another application at that point and switched back (I may also have restarted Opera), the Aria settings no longer let me browse local AI models to download, nor enable the toggle for local AI model usage, just as you're describing.
Maybe there's an unexpected conflict: Aria settings may require a local AI model to be downloaded and available before the toggle for local AI model usage can be enabled, but since we're no longer given the option to download local AI models, we're stuck in this state.
-
CalDubs last edited by
@fialamiri In case an Opera dev can help us, I will also add that there are errors related to Aria reported in the console log that might be relevant to our issue:
-
brdrbjvs last edited by leocg
@caldubs said in Enable local AI models in chat button isn't working:
... we're stuck in this state.
I'm also completely stuck in this state.
-
CalDubs last edited by
I logged a Jira bug for this issue, to which the Opera Aria QA Team replied on April 25th PT with the following:
Dear User,
We are aware of this issue and are currently working to solve it. As a workaround, you can try quitting the ollama.exe process. This should solve your problem for a while.
Best regards,
Opera Aria QA Team
So it seems they are at least aware and working on fixing the issue.
The suggested workaround of exiting the ollama process didn't change anything for me, but I'm on a Mac and the workaround reads as Windows-oriented.
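For anyone who wants to try the QA team's workaround from a terminal, here is a minimal sketch for macOS/Linux. It assumes the process is simply named "ollama"; the exact name Opera spawns may differ.

```shell
# Try to stop any running ollama process (macOS/Linux).
# "ollama" is an assumed process name; check with `ps` if unsure.
if pkill -x ollama 2>/dev/null; then
  echo "ollama process stopped; restart Opera Developer and retry"
else
  echo "no ollama process found"
fi

# On Windows, the PowerShell equivalent would be roughly:
#   Stop-Process -Name ollama -ErrorAction SilentlyContinue
```

On Windows you can also just end the ollama.exe process from Task Manager.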
-
CalDubs last edited by
Looks like the Opera Developer team fixed the issue, since I now see "Search local AI model" in a search field I can use to search for and download local AI models.
After downloading a local AI model, you have to select it in your Chats from a little chiclet/dropdown at the top.
Thanks Opera Dev team!
-
CalDubs last edited by
@caldubs said in Enable local AI models in chat button isn't working:
Looks like the Opera Developer team fixed the issue since I now see "Search local AI model" in a search field I can use to search and download local AI models.
After downloading a local AI model, you have to select it in your Chats from a little chiclet/dropdown at the top.
Thanks Opera Dev team!
Warning: sometimes I seem to revert to the same issue state as before, but killing the "ollama" process and restarting Opera Dev seems to help. It could be a timing issue: after clicking around in the issue state (for example, on the toggle and the "Feedback" button), I sometimes see the search section reappear.
At least it's a step in the right direction, but I definitely hope this experience gets polished up, since it has great potential for adoption by everyday users who wouldn't even know what "ollama" is.