July 11, LLM language app update

steven song

Progress has been made!

So I started out with just a simple rock paper scissors app: you type rock, paper, or scissors into a text prompt, and the LLM responds with its own choice and a cool quote. That was working pretty well.
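The text version basically boils down to one chat completion call with a system prompt that pins down the response format. Here's a minimal sketch of that idea, assuming the official openai Python SDK; the model name and prompt wording are placeholders, not my exact setup:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are playing rock paper scissors. "
    "Reply with exactly one of: rock, paper, or scissors, "
    "followed by a short inspirational quote on its own line."
)

def play_round(user_choice: str) -> str:
    """Send the user's move to the LLM and return its move plus a quote."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"I choose {user_choice}."},
        ],
    )
    return response.choices[0].message.content

print(play_round("rock"))
```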

Features I’ve added:

  • text to speech for the AI's response

  • voice input for conversation

  • voice to text to send to the LLM

What it looks like: there's a record button, you say which move you want, and the app responds out loud. It's working pretty well. The OpenAI APIs work very well with sufficient prompt engineering.
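Under the hood, the voice flow is just three API calls chained together: transcribe the recording, ask the chat model for a reply, then turn that reply into audio. A rough sketch of that pipeline, again assuming the openai Python SDK; the file names, models, and voice are illustrative, not the exact ones I'm using:

```python
from openai import OpenAI

client = OpenAI()

def voice_round(recording_path: str) -> str:
    """Transcribe the user's recording, get the LLM's reply, and synthesize speech."""
    # 1. Voice to text: Whisper turns the recording into a prompt string.
    with open(recording_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )

    # 2. LLM reply: same rock paper scissors prompt as the text version.
    chat = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Play rock paper scissors and add a short quote."},
            {"role": "user", "content": transcript.text},
        ],
    )
    reply = chat.choices[0].message.content

    # 3. Text to speech: synthesize the reply so the app can play it back.
    speech = client.audio.speech.create(
        model="tts-1",   # placeholder TTS model
        voice="alloy",   # placeholder voice
        input=reply,
    )
    speech.write_to_file("reply.mp3")  # saved audio that the front end plays
    return reply
```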

Things I want to work on next:

  • have the LLM respond in different languages (rough sketch of the idea after this list)

  • let the user speak in a different language too

  • add scenarios for conversation practice, e.g. ordering at a café

  • host it in the cloud so you all can test it out!
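For the language part, the plan is basically to thread a target language through the system prompt. None of this is built yet; the helper and prompt wording below are just placeholders to show the idea:

```python
def build_system_prompt(target_language: str) -> str:
    """Hypothetical helper: same game, but the reply comes back in the target language."""
    return (
        "You are playing rock paper scissors with a language learner. "
        f"Reply only in {target_language}: your move (rock, paper, or scissors) "
        "and a short encouraging quote."
    )

# e.g. build_system_prompt("Spanish") would go into the chat call's system message.
```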

Thanks for following.
