I think there was a misunderstanding: Claude 3 Sonnet is hosted and free to use, but it is not an open-source model. I was considering loading the entire manual into a custom GPT, but I wanted to ask first how you managed to do it. I actually started exploring this idea several months ago and even fine-tuned GPT-3.5 by converting the manual into question-answer pairs for the training dataset. That dataset was probably too small, so I abandoned the fine-tuning route. Retrieval-augmented generation seems like the better approach.
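For what it's worth, something along these lines is roughly what I have in mind for the RAG route. It's just a minimal sketch, assuming the OpenAI Python SDK and a plain-text export of the manual; the `manual.txt` path, chunk sizes, and model names are placeholders, not anything you described: chunk the manual, embed the chunks, retrieve the closest ones for each question, and hand them to the model as context.

```python
# Minimal RAG sketch: embed manual chunks, retrieve the closest ones for a
# question, and pass them to the chat model as context.
# Assumes the OpenAI Python SDK (>=1.0) and a plain-text export of the manual
# at "manual.txt" -- both are placeholders, not from the original thread.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def chunk_text(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split the manual into overlapping character chunks."""
    return [text[start:start + size] for start in range(0, len(text), size - overlap)]


def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts and return an (n, d) array of vectors."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])


manual = open("manual.txt", encoding="utf-8").read()
chunks = chunk_text(manual)
chunk_vecs = embed(chunks)


def answer(question: str, top_k: int = 4) -> str:
    """Retrieve the most similar chunks and ask the model to answer from them."""
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every chunk.
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n\n---\n\n".join(chunks[i] for i in np.argsort(sims)[-top_k:])
    chat = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Answer using only the manual excerpts provided."},
            {"role": "user",
             "content": f"Manual excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content


print(answer("How do I reset the device to factory settings?"))
```

The appeal over fine-tuning is that the manual content stays in the prompt rather than in the weights, so a small corpus isn't a problem and updates to the manual only require re-embedding the changed chunks.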