Understanding *args and **kwargs with unpacking in Python

Introduction

For a very long time, I was confused about *args and **kwargs in Python. This is a short summary of what these terms mean. Before I go deep into them, I have to make a clear distinction between parameters and arguments, two terms I had long used interchangeably:

Parameter: a placeholder name used when a function is defined.
Argument: the actual value passed into the function when it is called. ...
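The parameter/argument distinction, and the packing/unpacking behaviour of * and **, can be shown in a short sketch (a minimal, hypothetical example; the names `order`, `toppings`, and `details` are my own, not from the post):

```python
# 'item' is a parameter; *toppings packs extra positional arguments
# into a tuple; **details packs extra keyword arguments into a dict.
def order(item, *toppings, **details):
    return item, toppings, details

# "pizza", "cheese", "basil", and size="large" are arguments.
print(order("pizza", "cheese", "basil", size="large"))
# -> ('pizza', ('cheese', 'basil'), {'size': 'large'})

# The same operators also unpack in the other direction at call time:
extras = ["cheese", "basil"]
opts = {"size": "large"}
print(order("pizza", *extras, **opts))
# -> ('pizza', ('cheese', 'basil'), {'size': 'large'})
```

Both calls are equivalent: * spreads a sequence into positional arguments, ** spreads a mapping into keyword arguments.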

April 21, 2024 · 4 min · 837 words · Kaushal Bundel

Understanding ChatGPT Prompt Engineering - Part V - Using ChatGPT for Transforming Text

Basic Usage of LLMs

Loading the boilerplate code:

```python
import openai
import os
from dotenv import load_dotenv, find_dotenv  # library to load the local environment variables in Jupyter

_ = load_dotenv(find_dotenv())
api_key = os.getenv("OPENAI_API_KEY")

# creating the basic prompting function
client = openai.OpenAI()

def get_completion(prompt, model="gpt-3.5-turbo"):
    messages = [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message.content
```

LLM as a Transforming Device

Consider the text below in Sanskrit. We can translate it into the language of our choice, with the tonal quality we desire.

```python
text = f"""
योगस्थः कुरु कर्माणि सङ्गं त्यक्त्वा धनञ्जय।
सिद्ध्यसिद्ध्योः समो भूत्वा समत्वं योग उच्यते।।
"""

prompt = f"""
Please translate the Sanskrit text {text} into Hindi and English.
Use the following rules for the translation:
1. Output the original text and both translations in the form of a Python list.
2. The tone of the translation should be warm, friendly and personal.
"""

response = get_completion(prompt)
print(response)
```

```python
original_text = ["योगस्थः कुरु कर्माणि सङ्गं त्यक्त्वा धनञ्जय।", "सिद्ध्यसिद्ध्योः समो भूत्वा समत्वं योग उच्यते।।"]
hindi_translation = ["योगस्थः कर्म करो, संग से दूर होकर धनंजय।", "सिद्धि और असिद्धि में समान बनकर समता को योग कहा जाता है।"]
english_translation = ["Stay focused on your actions while letting go of attachment, Arjuna.", "Being equal in success and failure is called yoga."]
translations = [original_text, hindi_translation, english_translation]
print(translations)
```

Running the returned code prints:

```
[['योगस्थः कुरु कर्माणि सङ्गं त्यक्त्वा धनञ्जय।', 'सिद्ध्यसिद्ध्योः समो भूत्वा समत्वं योग उच्यते।।'], ['योगस्थः कर्म करो, संग से दूर होकर धनंजय।', 'सिद्धि और असिद्धि में समान बनकर समता को योग कहा जाता है।'], ['Stay focused on your actions while letting go of attachment, Arjuna.', 'Being equal in success and failure is called yoga.']]
```

Expanding the given translation into a Socratic conversation. ...

March 26, 2024 · 8 min · 1550 words · Kaushal Bundel

Understanding ChatGPT Prompt Engineering - Part IV - Using ChatGPT for Inference Tasks

Basic Usage of LLMs

Loading the boilerplate code:

```python
import openai
import os
from dotenv import load_dotenv, find_dotenv  # library to load the local environment variables in Jupyter

_ = load_dotenv(find_dotenv())
api_key = os.getenv("OPENAI_API_KEY")

# creating the basic prompting function
client = openai.OpenAI()

def get_completion(prompt, model="gpt-3.5-turbo"):
    messages = [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message.content
```

LLM as an Inference Device

Inference is essentially the extraction of specific properties from a given text: sentiments, specific key-value pairs, tone, labels, names, and so on. The applications of this capability are very wide; I hope to illustrate a few examples here. ...
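As a taste of the inference tasks described above, here is a minimal sketch of a sentiment-and-label extraction prompt (the review text and the helper name `build_inference_prompt` are hypothetical, not from the post):

```python
def build_inference_prompt(review):
    # Hypothetical helper: asks the model for sentiment plus
    # extracted key-value pairs, returned as JSON for easy parsing.
    return f"""
Identify the following from the review delimited by triple quotes:
1. Sentiment (positive or negative)
2. The item purchased
3. The company that made the item
Format the answer as a JSON object with keys
"sentiment", "item", and "company".
Review: '''{review}'''
"""

review = "The lamp arrived quickly and the build quality is great."
prompt = build_inference_prompt(review)
# response = get_completion(prompt)  # requires an OpenAI API key
```

Requesting JSON output turns a free-form completion into structured data that downstream code can consume directly.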

March 26, 2024 · 7 min · 1395 words · Kaushal Bundel

Understanding ChatGPT Prompt Engineering - Part III - Using ChatGPT for Summarization Tasks

Basic Usage of LLMs

Loading the boilerplate code:

```python
import openai
import os
from dotenv import load_dotenv, find_dotenv  # library to load the local environment variables in Jupyter

_ = load_dotenv(find_dotenv())
api_key = os.getenv("OPENAI_API_KEY")

# creating the basic prompting function
client = openai.OpenAI()

def get_completion(prompt, model="gpt-3.5-turbo"):
    messages = [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message.content
```

Using ChatGPT for summarization tasks

This is the most common use case, and I am sure most people already use ChatGPT for it. But there are subtle tricks one can use to improve the quality of the summaries. ...
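One such trick is constraining the summary's length and focus in the prompt itself; a minimal sketch (the helper `build_summary_prompt` and its parameters are my own illustration, not the post's code):

```python
def build_summary_prompt(text, max_words=30, focus=None):
    # Hypothetical helper: caps the summary length and optionally
    # steers it toward one aspect of the input text.
    focus_clause = f", focusing on {focus}" if focus else ""
    return f"""
Summarize the text delimited by triple quotes
in at most {max_words} words{focus_clause}.
Text: '''{text}'''
"""

prompt = build_summary_prompt(
    "Long product review ...",
    max_words=20,
    focus="shipping and delivery",
)
# response = get_completion(prompt)  # requires an OpenAI API key
```

Swapping the focus parameter (price, quality, shipping) yields different summaries of the same text, which is where the technique pays off.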

March 24, 2024 · 11 min · 2243 words · Kaushal Bundel

Understanding ChatGPT Prompt Engineering - Part II - An Iterative Approach

Prompt Engineering as an Iterative Process

There is no single effective method for getting the desired results from an LLM. For general tasks, getting a response to a given problem is simple, whereas for specific problems one needs to use iterative methods to arrive at the desired result. In addition, sometimes we simply do not know what the final result should look like; in such situations, moulding the LLM's response step by step is essential. ...
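The iterative workflow described above can be sketched as successive prompt revisions, each adding a constraint after inspecting the previous output (a hypothetical illustration; the fact-sheet scenario and prompt wording are my own, and `get_completion` is the helper defined in the later parts of this series):

```python
# Each revision keeps the previous prompt and adds one constraint
# discovered by inspecting the model's last answer.
base = "Write a product description based on this fact sheet: {sheet}"
revisions = [
    base,                                                # v1: output too long
    base + " Use at most 50 words.",                     # v2: wrong audience
    base + " Use at most 50 words."
         + " Write for furniture retailers, so focus on materials.",  # v3
]

# for prompt in revisions:
#     print(get_completion(prompt.format(sheet=fact_sheet)))  # inspect, then refine
```

Each run is cheap, so refining the prompt a constraint at a time converges quickly on the desired shape of the answer.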

March 21, 2024 · 7 min · 1297 words · Kaushal Bundel