Getting the jitters on a first date? Heart pounding before a major interview? Students at Stanford University may have a solution, so that no one needs to rack their brains for the right words.
They have developed a pair of glasses that “listens to your conversation and tells you exactly what to say next.” In a tweet, Stanford student Bryan Hua-Ping Chiang explains the newly developed prototype.
In a video demonstration of the AI/AR glasses' capabilities, the Stanford researchers can barely contain their delight. Alix poses questions to Varun, who is wearing the AR monocle. The glasses interpret each question and, after a short delay, transcribe a suggested answer and display it on the glass screen, where Varun reads it out.
The glasses are based on OpenAI's Whisper, a speech recognition model, and the Monocle AR glasses made by Brilliant Labs. The hardware also includes a microphone, a high-resolution display, and a camera.
The AI glasses, dubbed 'rizzGPT', communicate via Bluetooth with a web application on a host device, such as the user's phone. When the user engages in conversation, the audio is captured and transcribed to text in real time by OpenAI's Whisper. The transcript is then fed to a chatbot, which suggests answers back to the user on the monocle's display.
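The loop described above can be sketched in a few lines. This is only an illustration of the pipeline's shape, not the actual rizzGPT code: the `transcribe` and `suggest_reply` functions here are hypothetical stand-ins for the real Whisper and chatbot calls, whose details the demo does not disclose.

```python
# Sketch of a rizzGPT-style loop: capture audio, transcribe it,
# ask a chatbot for a reply, and push the reply to the display.
# transcribe() and suggest_reply() are stubs standing in for the
# real Whisper and chatbot API calls.

def transcribe(audio_chunk: bytes) -> str:
    # In the real system, Whisper converts speech audio to text.
    # Stubbed here: we pretend the bytes are already the words spoken.
    return audio_chunk.decode("utf-8")

def suggest_reply(transcript: str) -> str:
    # In the real system, the transcript goes to a chatbot,
    # which returns a suggested answer.
    return f"Suggested reply to: {transcript!r}"

def on_audio(audio_chunk: bytes) -> str:
    """One conversational turn: listen, transcribe, suggest."""
    transcript = transcribe(audio_chunk)
    reply = suggest_reply(transcript)
    # The reply would then be sent over Bluetooth to the monocle's
    # display for the wearer to read out.
    return reply
```

The key design point is that the glasses themselves do little computing: the heavy lifting (transcription and reply generation) happens in the web app on the host phone, with the monocle acting as a microphone and display.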