How to import custom data/documents and ask questions based on those data? #132

Sorry, I'm new to ChatAI models. My question is: if I have a bunch of markdown documents, could I import those documents and ask questions based on them, using this repository and the chatbot UI?

Comments
Hi @revskill10 👋 We have two full e2e examples here now: https://github.com/go-skynet/LocalAI/tree/master/examples/langchain-chroma and https://github.com/go-skynet/LocalAI/tree/master/examples/query_data, and a blog post over here: https://mudler.pm/posts/localai-question-answering/ Cheers!
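To make the flow concrete, here is a minimal sketch of the query_data approach, assuming a LocalAI instance on localhost:8080 exposing its OpenAI-compatible API and the llama_index/langchain versions current at the time of this thread; the model name, paths, and parameter values are illustrative placeholders, not the exact ones from the example:

```python
# Minimal sketch: index local documents (markdown, PDF, ...) and query them
# through LocalAI's OpenAI-compatible API. Endpoint, model name, and sizes
# below are assumptions for illustration.
import os

from langchain.llms import OpenAI
from llama_index import (
    GPTVectorStoreIndex,
    LLMPredictor,
    PromptHelper,
    ServiceContext,
    SimpleDirectoryReader,
)

# Point the OpenAI client at LocalAI instead of api.openai.com.
os.environ["OPENAI_API_KEY"] = "sk-anything"  # LocalAI does not check the key
os.environ["OPENAI_API_BASE"] = "http://localhost:8080/v1"

llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name="gpt-3.5-turbo"))
# max_input_size, num_output, chunk overlap (a 0..1 ratio in recent releases)
prompt_helper = PromptHelper(400, 256, 0.2)
service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, prompt_helper=prompt_helper
)

# Load everything under ./data and build a vector index over it.
documents = SimpleDirectoryReader("data").load_data()
index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)
index.storage_context.persist(persist_dir="./storage")

# Ask a question grounded in the indexed documents.
query_engine = index.as_query_engine()
print(query_engine.query("What are these documents about?"))
```

The same SimpleDirectoryReader call picks up markdown files, so the use case in the original question is covered by dropping the .md files into the data/ directory.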
Hi, I know this is closed, but I think it's the right place to show what happens when I try the query_data example. Sorry in advance if I misunderstood something.
I tried to run the container, but it failed because of an env issue, which I fixed in docker-compose. This way the container runs. But then, when I load a document into the data directory and try to run store.py, I encounter another problem:

```
Traceback (most recent call last):
  File "/home/g/LocalAI/examples/query_data/store.py", line 6, in <module>
    from llama_index import GPTVectorStoreIndex, SimpleDirectoryReader, LLMPredictor, PromptHelper, ServiceContext
ModuleNotFoundError: No module named 'llama_index'
```

So I had to `pip install llama-index` and try again. This time store.py fails with a different error, about the value of chunk_overlap_ratio.
So I did a git pull, just in case I was missing recent changes, but the result did not change. Any advice? The document is a single PDF file... Thanks!
Just edit the code and change chunk_overlap_ratio to something between 0 and 1. I suppose this was changed from 0..100 to 0..1 recently. Then you probably have to `pip install sentence_transformers`.
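For reference, a sketch of what that edit to store.py might look like, assuming the llama_index release where PromptHelper's third argument became a 0-to-1 ratio; the numbers are illustrative, not the example's originals:

```python
from llama_index import PromptHelper

max_input_size = 400
num_output = 256

# Older llama_index releases took an absolute overlap here (e.g. 20);
# newer ones expect chunk_overlap_ratio, a float between 0 and 1.
chunk_overlap_ratio = 0.2

prompt_helper = PromptHelper(max_input_size, num_output, chunk_overlap_ratio)
```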