Tech Savvyed
News

OpenAI’s new voice AI can listen, think, and talk back in 70+ languages

By News Room · 8 May 2026 · 2 Mins Read

OpenAI has launched three new audio models in its Realtime API, and they are a big deal for anyone building voice-powered apps. The three models are GPT-Realtime-2, GPT-Realtime-Translate, and GPT-Realtime-Whisper. 

Together, they move voice AI beyond simple back-and-forth responses toward something that can understand you, take action, and keep up with a real conversation.

If their demo is anything to go by, we have just seen the next evolution in how voice AI models work. 

So what can these models actually do?

GPT-Realtime-2 is the headline act. It brings GPT-5-class reasoning to live voice interactions, meaning it can handle harder requests without dropping the thread of the conversation. 

It can call multiple tools simultaneously and even narrate what it’s doing with phrases like “checking your calendar” or “let me look into that.” It also has a larger context window of 128K tokens, which means longer, more coherent sessions. Developers can even adjust the reasoning effort based on the complexity of the request.
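The article doesn't show the wire format, but OpenAI's Realtime API is driven by JSON events over a WebSocket, so a GPT-Realtime-2 session would presumably be configured with something like a `session.update` event. Here is a minimal sketch of building that payload; the `reasoning_effort` field name and the tool shape are assumptions based on the capabilities described above, not a documented schema:

```python
import json

def build_session_update(model: str, reasoning_effort: str, tools: list) -> str:
    """Build a Realtime-style session.update event as a JSON string.

    The event shape mirrors OpenAI's existing Realtime API; the
    "reasoning_effort" field is an assumption drawn from the article.
    """
    event = {
        "type": "session.update",
        "session": {
            "model": model,
            "reasoning_effort": reasoning_effort,  # e.g. "low" | "medium" | "high"
            "tools": tools,
            "modalities": ["audio", "text"],
        },
    }
    return json.dumps(event)

payload = build_session_update(
    model="gpt-realtime-2",
    reasoning_effort="high",
    tools=[{"type": "function", "name": "check_calendar"}],  # hypothetical tool
)
```

Per the article, developers would dial `reasoning_effort` down for simple requests to cut latency and up for harder ones.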

GPT-Realtime-Translate is probably my favorite. It’s the closest we have come to having Star Trek’s Universal Translator in real life. It supports live speech translation across 70+ input languages and 13 output languages. 

The best part of the demo was that even when a new person joined and spoke a different language, GPT-Realtime-Translate had no trouble translating both speakers into English in real time.
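That multi-speaker behavior suggests input language is detected automatically, so only the output language needs configuring. A hedged sketch of such a session payload; the `output_language` field name is an assumption, not a documented parameter:

```python
import json

def build_translate_session(output_language: str) -> str:
    """Build a hypothetical session config for GPT-Realtime-Translate.

    Input language is auto-detected, which is why a new speaker in a
    different language would need no reconfiguration; only the output
    language (one of the 13 supported) is pinned.
    """
    event = {
        "type": "session.update",
        "session": {
            "model": "gpt-realtime-translate",
            "output_language": output_language,  # assumed field name
        },
    }
    return json.dumps(event)
```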

Finally, there’s GPT-Realtime-Whisper. Most speech-to-text models wait for the speaker to finish before producing a full transcript. This one is a streaming transcription model that converts speech to text as the speaker talks. That makes it useful for live captions, meeting notes, and any voice-powered workflow where waiting for a transcript is not an option.
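The streaming pattern this enables can be sketched without the network layer: a consumer folds partial-text events into a running caption as they arrive. The event names below are assumptions loosely modeled on Realtime transcription deltas; the point is the accumulate-as-you-go pattern:

```python
def accumulate_transcript(events) -> str:
    """Fold streaming transcription events into a final transcript.

    A live-caption UI would repaint after every delta instead of
    waiting for the "done" event.
    """
    partial = []
    for event in events:
        if event["type"] == "transcription.delta":  # assumed event name
            partial.append(event["text"])
        elif event["type"] == "transcription.done":  # assumed event name
            break
    return "".join(partial)

events = [
    {"type": "transcription.delta", "text": "Hello "},
    {"type": "transcription.delta", "text": "world"},
    {"type": "transcription.done"},
]
print(accumulate_transcript(events))  # Hello world
```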

Can anyone use these new voice AI models?

For now, OpenAI has released these models to developers only, but the apps they build will reach everyone. A developer could, for example, build a real-time translator app that lets users converse with people in different languages.

Many companies are already testing the new models. Zillow is building a voice assistant that can search homes and schedule tours from a single spoken request. Priceline’s assistant can check your flights and hotels, cancel them, and book new ones. Vimeo is using the models for real-time transcription.

[Image: the new voice AI model working inside Priceline]

Pricing starts at $0.017 per minute for Whisper, $0.034 per minute for Translate, and $32 per 1M audio input tokens for GPT-Realtime-2.
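At those rates, a back-of-the-envelope cost check is simple arithmetic. A sketch using the listed prices, with a 30-minute session and a hypothetical 100K audio input tokens as examples:

```python
# Listed prices from the announcement.
WHISPER_PER_MIN = 0.017       # $ per minute of audio
TRANSLATE_PER_MIN = 0.034     # $ per minute of audio
REALTIME2_PER_M_INPUT = 32.0  # $ per 1M audio input tokens

def minute_cost(minutes: float, rate_per_min: float) -> float:
    """Cost of a session billed per minute, rounded to cents-ish precision."""
    return round(minutes * rate_per_min, 4)

def token_cost(tokens: int) -> float:
    """Cost of GPT-Realtime-2 audio input billed per 1M tokens."""
    return round(tokens / 1_000_000 * REALTIME2_PER_M_INPUT, 4)

whisper_30 = minute_cost(30, WHISPER_PER_MIN)      # 0.51
translate_30 = minute_cost(30, TRANSLATE_PER_MIN)  # 1.02
realtime2_100k = token_cost(100_000)               # 3.2
```

So a half-hour of live captions runs about $0.51, and the same half-hour of live translation about $1.02.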
