Using NlpCloud model
Hello,
I have a question about connecting to NlpCloud using the function below:
def llm_generate_text(prompt, service, model):
    if service == 'OpenAI':
        generated_text = openai_generate(prompt, model)
    elif service == 'NlpCloud':
        generated_text = nlp_cloud_generate(prompt, model)
    elif service == 'Cohere':
        generated_text = cohere_generate(prompt, model)
    return generated_text
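For reference, this is roughly how I call it; the OpenAI and Cohere model names below are only placeholders, while the NlpCloud call matches the one that fails:

# Example calls (the OpenAI and Cohere model names are placeholders)
prompt = "Write a one-sentence summary of prompt engineering."
openai_text = llm_generate_text(prompt, "OpenAI", "gpt-3.5-turbo")      # placeholder model name
cohere_text = llm_generate_text(prompt, "Cohere", "command")            # placeholder model name
nlpcloud_text = llm_generate_text(prompt, "NlpCloud", "nllb-200-3-3b")  # this is the call that fails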
I ran this function with the OpenAI and Cohere services successfully,
but when I tried running it with NlpCloud I got this error message:
Traceback (most recent call last):
File "D:\Users\hbenmoshe\promptEngineering\main.py", line 10, in <module>
open_ai_response = llm.llm_generate_text(prompt,"NlpCloud","nllb-200-3-3b")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Users\hbenmoshe\promptEngineering\llm.py", line 18, in llm_generate_text
generated_text = nlp_cloud_generate(prompt, model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Users\hbenmoshe\promptEngineering\llm.py", line 45, in nlp_cloud_generate
result = client.generation(
^^^^^^^^^^^^^^^^^^
TypeError: Client.generation() got an unexpected keyword argument 'min_length'
I also asked ChatGPT what the problem is and how to solve it, and basically its answer was to remove 'min_length':
def nlp_cloud_generate(user_prompt, selected_model):
    client = nlpcloud.Client(selected_model, nlp_cloud_key, gpu=True, lang="en")
    result = client.generation(
        user_prompt,
        min_length=0,
        length_penalty=1,
        do_sample=True,
        early_stopping=False,
        num_beams=1,
        no_repeat_ngram_size=0,
I checked NLPCloud's new code, and here is a snippet from their website:
import nlpcloud

client = nlpcloud.Client("finetuned-llama-2-70b", "f17207e55bf7c65f8294e0bec843b3bc2102ddf9", gpu=True)
client.generation(
    """""",
    max_length=50,
    length_no_input=True,
    remove_input=True,
    end_sequence=None,
    top_p=1,
    temperature=0.8,
    top_k=50,
    repetition_penalty=1,
    num_beams=1,
    num_return_sequences=1,
    bad_words=None,
    remove_end_sequence=False
)
Try it; maybe they removed the minimum length parameter in their new update.
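One quick way to confirm this on your side is to print the signature of the installed client's generation() method; this is plain standard-library Python, nothing specific to the course code:

import inspect
import nlpcloud

# Shows which keyword arguments your installed nlpcloud version accepts;
# if 'min_length' is not in the list, the parameter was removed in the update.
print(inspect.signature(nlpcloud.Client.generation))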
I also updated the code in the course.
@admin Dear Hasan, I really appreciate the fact that you not only "sell" the course, but actually act as a mentor who accompanies it and answers every question, suggestion, or idea. I wanted to ask: when you update the course, a piece of code, a specific lesson, or an instruction, could you add an update date at the top of the page or the code (even just as a note)? Because I believe that the teaching process requires repetition of the material. Thanks
@hagai-ben-moshe make sure you follow Hasan on Github.
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
@hagai-ben-moshe Thank you!
Which page are you pointing to? The course page or the code page?
I didn't understand this sentence: "Because I believe that the teaching process requires repetition of the material." Can you please clarify for me?
@admin I think he means that he learns best by watching the videos multiple times, but he wants to know when the video and/or code for a video changed. I gave him the link to your Github profile so he can be notified when the repositories change.
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
Hi,
Ssadvisor is right!
That was exactly my intention.
Hello all,

I ran into the same problem as @hagai-ben-moshe when trying out NlpCloud, and thought I would do a quick search before posting a new thread.

Just a heads-up @admin Hasan: the code provided at https://learnwithhasan.com/lessons/llm-generic-function/ is not updated yet, so new learners who want to try out NlpCloud will get the TypeError.

Recommending a solution for Python newbies like me: replace the entire #nlpCloud Function with the script below:
#nlpCloud Function
nlp_cloud_key = "use-your-own-API"

def nlp_cloud_generate(user_prompt, selected_model):
    client = nlpcloud.Client(selected_model, nlp_cloud_key, gpu=True, lang="en")
    result = client.generation(
        user_prompt,
        max_length=200,
        length_no_input=True,
        remove_input=True,
        end_sequence=None,
        top_p=1,
        temperature=0.8,
        top_k=50,
        repetition_penalty=1,
        num_beams=1,
        num_return_sequences=1,
        bad_words=None,
        remove_end_sequence=False
    )
    return result["generated_text"]
With this, I was able to try out three of NlpCloud's models: dolphin, chatdolphin, and finetuned-llama-2-70b.
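For example, calls along these lines worked through the same llm_generate_text() wrapper from the course (the prompt text is just an example):

prompt = "Explain what prompt engineering is in one sentence."  # example prompt
print(llm_generate_text(prompt, "NlpCloud", "dolphin"))
print(llm_generate_text(prompt, "NlpCloud", "chatdolphin"))
print(llm_generate_text(prompt, "NlpCloud", "finetuned-llama-2-70b"))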
Cheers
@dfby999 Thanks for the update.
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack