How To Create AI Tools Fast (Less Than 2 Minutes)
Hi Hasan, I am following your YouTube short tutorial "How To Create AI Tools Fast (Less Than 2 Minutes)". You didn't mention anything about the helpers directory that I had to create to put the structured_generator.py file in. I also had an issue when I ran the script: according to ChatGPT, I had to add some code to structured_generator.py, but parts of it need to be replaced (starting with your_module and more) and I don't know exactly what to replace them with! (Check the code snap.)
@zimbabox please share your code via a public link on Github. Without viewing the entire code it is hard to help you.
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
Hi, there is nothing to share: the code, links, and all the materials already exist on the GitHub link that Hasan provided on his website. I just used his code to see whether it would work for me, and it didn't, for the reason shown in the error console of Visual Studio Code. Thanks anyway for your help!
Hi Friend, What is the helpers directory?
You don't need to create anything; just install the packages and run. Don't change anything.
helpers.py is a module, not a directory.
@zimbabox okay, but your use of the code that @Hasan provided isn't correct, and the pictures and detail you've given aren't enough to help you. So, please share a public link to your code if you want help. Feel free to schedule a 1-on-1 with me if you wish, using the calendar link in my signature.
Thanks for your follow-up on my issue. That issue is fixed now, but I have run into a new one! I am trying to implement the tutorial "LLM Generic Function" and the code is not generating any output for some reason. I think it's outdated.
```python
import openai
import nlpcloud
import cohere

"""
LLMs:
OpenAI
NlpCloud
Cohere
You can add your own
"""

def llm_generate_text(prompt, service, model):
    if service == 'OpenAI':
        generated_text = openai_generate(prompt, model)
    elif service == 'NlpCloud':
        generated_text = nlp_cloud_generate(prompt, model)
    elif service == 'Cohere':
        generated_text = cohere_generate(prompt, model)
    return generated_text

# OpenAI function
openai.api_key = "sk-RNm6d9"

def openai_generate(user_prompt, selected_model):
    completion = openai.chat.completions.create(
        model=selected_model,
        messages=[
            {"role": "user", "content": user_prompt}
        ]
    )
    return completion.choices[0].message.content

# NLP Cloud function
nlp_cloud_key = "bc1dc0031"

def nlp_cloud_generate(user_prompt, selected_model):
    client = nlpcloud.Client(selected_model, nlp_cloud_key, gpu=True, lang="en")
    result = client.generation(
        user_prompt,
        min_length=0,
        max_length=100,
        length_no_input=True,
        remove_input=True,
        end_sequence=None,
        top_p=1,
        temperature=0.8,
        top_k=50,
        repetition_penalty=1,
        length_penalty=1,
        do_sample=True,
        early_stopping=False,
        num_beams=1,
        no_repeat_ngram_size=0,
        num_return_sequences=1,
        bad_words=None,
        remove_end_sequence=False
    )
    return result["generated_text"]

# Cohere API
cohere_api_key = "KXSfvlpAXW"

def cohere_generate(user_prompt, selected_model):
    co = cohere.Client(cohere_api_key)  # This is your trial API key
    response = co.generate(
        model=selected_model,
        prompt=user_prompt,
        max_tokens=300,
        temperature=0.9,
        k=0,
        stop_sequences=[],
        return_likelihoods='NONE')
    return response.generations[0].text
```
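Side note on that function: if `service` doesn't match any of the three `if`/`elif` branches, `generated_text` is never assigned and Python raises `UnboundLocalError` instead of a readable message. A minimal sketch of a more defensive variant, using a dispatch table (the generator functions here are stubs standing in for the real `openai_generate` / `nlp_cloud_generate` / `cohere_generate` API calls):

```python
# Stub generators: in the real script these call the provider APIs.
def openai_generate(prompt, model):
    return f"[openai:{model}] {prompt}"

def nlp_cloud_generate(prompt, model):
    return f"[nlpcloud:{model}] {prompt}"

def cohere_generate(prompt, model):
    return f"[cohere:{model}] {prompt}"

# Map the supported service names to their generator functions.
GENERATORS = {
    "OpenAI": openai_generate,
    "NlpCloud": nlp_cloud_generate,
    "Cohere": cohere_generate,
}

def llm_generate_text(prompt, service, model):
    try:
        generate = GENERATORS[service]
    except KeyError:
        # Fail loudly with the list of valid names instead of an
        # UnboundLocalError later on.
        raise ValueError(
            f"Unknown service {service!r}; expected one of {sorted(GENERATORS)}"
        )
    return generate(prompt, model)
```

Same behavior for valid inputs, but an unsupported service name now produces an error message that tells you exactly which strings are accepted.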
https://github.com/almutabbil/LLM_Generic_Function/tree/main
@zimbabox if you add your app.py (or whatever you called it) file to the repository I can try to help debug it.
I am on the "Generate Ideas" lesson and I am getting this error. Why is that?
@zimbabox you've given "service" as the service, which isn't an expected value, so you get an exception. If you review the code in llm.py you'll see the filter for the expected services; give the function one of those. While you're at it, "command" probably isn't the model you want either.
@zimbabox the service must be one of the ones defined in the function. Please review the "llm_generate_text" function.
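To illustrate (a sketch mirroring the filter in `llm_generate_text`; the model names are examples, not taken from the repo): the `service` argument must be exactly one of the three strings the `if`/`elif` chain tests for, not the placeholder word "service".

```python
# The only service strings llm_generate_text checks for:
SUPPORTED_SERVICES = ("OpenAI", "NlpCloud", "Cohere")

def validate_call(service, model):
    """Reject a call before it reaches the generator branches,
    the way the if/elif filter in llm_generate_text would."""
    if service not in SUPPORTED_SERVICES:
        raise ValueError(
            f"service must be one of {SUPPORTED_SERVICES}, got {service!r}"
        )
    return service, model

validate_call("OpenAI", "gpt-3.5-turbo")   # passes the filter
# validate_call("service", "command")      # would raise ValueError
```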