The Prompt Desk

Using LLMs to Generate Source Code: Part 2

17 min | 14/02/2024

Description

This episode discusses improving code generation with LLMs by dynamically extracting functions and their surrounding context, using techniques such as Python's inspect module and JavaScript's Function.prototype.toString. The goal is to produce code that integrates well with existing projects, improving customization and practicality.
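As a minimal sketch of the technique described above, Python's standard inspect module can pull a function's source code at runtime so it can be fed to an LLM as context. The function and prompt wording here are illustrative assumptions, not from the episode itself:

```python
import inspect

def add_tax(price: float, rate: float = 0.13) -> float:
    """Return price with tax applied."""
    return price * (1 + rate)

# Dynamically extract the function's source so it can be embedded
# in an LLM prompt as context for generating compatible code.
source = inspect.getsource(add_tax)

# A hypothetical prompt template; the wording is illustrative only.
prompt = (
    "Given this existing function:\n\n"
    f"{source}\n"
    "Write a companion function in the same style that applies a discount."
)
```

In JavaScript the analogous trick is calling `.toString()` on a function object, which returns its source text.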

Continue listening to The Prompt Desk Podcast for everything LLM & GPT, Prompt Engineering, Generative AI, and LLM Security.

Check out PromptDesk.ai for an open-source prompt management tool.

Check out Brad’s AI Consultancy at bradleyarsenault.me

Add Justin Macorin and Bradley Arsenault on LinkedIn.

Please fill out our listener survey here to help us create a better podcast: https://docs.google.com/forms/d/e/1FAIpQLSfNjWlWyg8zROYmGX745a56AtagX_7cS16jyhjV2u_ebgc-tw/viewform?usp=sf_link


Hosted by Ausha. See ausha.co/privacy-policy for more information.

