@mxmatt38 Make sure both devices are compatible with the application or service you're using.
Check whether the connection has any specific requirements, such as both devices being on the same network or having the necessary permissions enabled.
Latest posts made by rikazkhan7
RE: My flow Not working - Opera for Windows - Lounge
RE: Local LLM not working? - Feedback
@ramarianhaertel Troubleshooting Gemma and Mistral - No Reply from Local LLMs
If you're not receiving any response from Gemma and Mistral after entering input, there could be a few reasons for this:
- Installation Issues: Double-check that the installations completed successfully and that no errors occurred during the process. A failed or partial installation can prevent the models from running at all.
- Environment Setup: Ensure that your environment is set up properly to run local LLMs, with all required dependencies installed and configured correctly.
- Input Format: Make sure you're providing input in the format the models expect. Depending on how Gemma and Mistral are served, they may require prompts in a specific structure.
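One quick way to narrow down which of the above is the problem is to query the model directly over its local HTTP API. The sketch below is a minimal example, assuming the models are served through Ollama (its default endpoint `http://localhost:11434/api/generate` and the `model`/`prompt`/`stream` payload fields are assumptions here; adjust them for whatever runner you actually installed). A connection error points to installation or environment issues, while a reply that comes back empty points to the prompt format.

```python
import json
import urllib.error
import urllib.request

# Assumed default endpoint for an Ollama-style local server.
ENDPOINT = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request, so the server returns one complete JSON reply.
    return {"model": model, "prompt": prompt, "stream": False}

def query_local_llm(model: str, prompt: str, timeout: float = 30.0):
    """Send a prompt to a locally served model; return its text reply or None."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.loads(resp.read()).get("response")
    except urllib.error.URLError as err:
        # No reply at all usually means the server isn't running,
        # or the model name doesn't match an installed model.
        print(f"No response from local LLM: {err}")
        return None

if __name__ == "__main__":
    print(query_local_llm("gemma", "Say hello in one word."))
```

If this script prints a connection error, the server (or model) isn't running; if it returns `None` or an empty string despite a live server, look at the model name and prompt format.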