Why you’ll speak 20 languages by Christmas
I live in the future, at least as far as language translation technology is concerned.
Over the past couple of months, I’ve spent most of my time in Italy and Mexico. Throughout that time, I understood Italian and Spanish — thanks to the Live Translation feature of my Ray-Ban Meta glasses.
Announced in September, “Live Translation” is based on Meta’s Llama 3.2 AI model and is currently limited to US and Canada users enrolled in Meta’s Early Access Program.
The feature translates audible French, Spanish, and Italian into audible English in the glasses and typed English on the app — and shows the wearer’s English translated into the selected language.
When I first arrived at the Catania airport in Sicily, I turned on Live Translation by saying, “Hey, Meta: Start Live Translation.”
The first thing I heard using this feature was airport employees directing travelers. They spoke in Sicilian-accented Italian, but I heard: “European passport holders please enter this line; all others go here.”
From that point on, I turned on Live Translation from time to time and was able to understand simple things people might be telling me. In a few cases, I translated my own words into Italian (first speaking in English, then reading the translation in the app in Italian).
It’s not perfect. It also translates English into English (and sometimes mistranslates English to English). It can fail to translate words spoken nearby. At other times, it will translate words spoken across the room when people are talking to each other, not to me.
Ray-Ban Meta glasses also do another neat translation trick. While using Live AI, another Early Access feature, you can look at a sign in a foreign language and ask what it means in English, and it will speak the English translation.
Despite the language glitches, this is a clear glimpse of the future for all of us — the very near future.
Apple AirPods
Bloomberg reported on March 13 that Apple will add live language translation to iOS 19 for AirPods users.
According to the report, the user’s AirPods capture foreign language speech and speak the English translation into the ears of the AirPod wearer. Then, when the user speaks English, the iPhone speaker plays the translation into the foreign language via Apple’s Translate app.
The feature is expected to be announced at Apple’s Worldwide Developers Conference (WWDC) in June and released in the fall.
The languages to be supported have not been reported, but Apple’s Translate app supports 20. And Apple is by no means first to market with language translation earbuds.
Google Pixel Buds
Google has included live translation through its Pixel Buds and Pixel Buds Pro earbuds since October 2017.
The feature does what I described for the Apple AirPods: It delivers translated foreign-language speech through the Pixel Buds while outputting translated English words through the phone speaker. That’s what happens in Conversation Mode. When users switch to Transcribe Mode, they can get a live transcription of the translated foreign language, which is useful for listening to business presentations, attending speeches, or watching movies.
The Pixel Buds’ language translation feature works via the excellent Google Translate app. In Conversation Mode, it supports more than 100 languages; Transcribe Mode, however, only supports four languages: French, German, Italian, and Spanish.
Language translation requires an Android device running Android 6.0 or later that’s Google Assistant-enabled, including non-Pixel phones. However, if you do have an advanced Pixel phone, the translation gets much better.
Compatible Pixel phones (especially models with a Tensor processor) offer Live Translate with text messages, through the camera, in videos, and even during phone calls.
A world of translation products
Language translation features that go in the ears come in many varieties.
The TimeKettle WT2 Edge/W3 is highly rated. It supports 40 online languages and 13 pairs of offline languages, enabling two-way simultaneous translation that eliminates the need for alternating speech patterns. The system achieves up to 95% translation accuracy through its AI platform, according to the company.
The Vasco Translator E1 supports an impressive 51 languages and uses 10 different AI-powered translation engines. The system allows up to 10 people to join conversations using the mobile app.
The Pilot by Waverly Labs translates the wearer’s words to others and also translates replies back to the wearer’s language.
Smart glasses that translate are also available.
- The Solos AirGo 3 Smart Glasses perform real-time language translation via the SolosTranslate platform and OpenAI’s ChatGPT.
- Brilliant Labs’ Frame AI Glasses are open-source AR glasses that can translate languages seen in the environment, recognize images and provide information about them, and search the internet for results. The glasses use augmented reality to display translations directly in the user’s field of vision. They integrate with OpenAI, Whisper, and Perplexity technologies.
- TCL AR Glasses can live-translate conversations, offering an integrated heads-up display for showing the translation.
Other form factors exist, too, including the TimeKettle X1, K&F Concept Language Translator Device, ili Wearable Translator, Vasco Translator E1, TimeKettle WT2 Edge, and Timekettle ZERO Language Translator.
All these products demonstrate that the technology for holding conversations, reading signs, and understanding people in foreign languages while traveling the world is already here, and has been for a while.
Going mainstream
What’s about to change is the arrival of this feature in totally mainstream products. Something like 100 million people use their Apple AirPods almost every day. Meta expects to sell more than 10 million Ray-Ban Meta glasses by the end of 2026, by which time Live Translation and Live AI will be offered to all users globally.
What’s really happening is that we’re heading for a world in which every wearable speaker — earbuds, headphones, smart glasses, and more — will give us live language translation on command or even automatically.
The worst thing about this emerging trend is that, in the future, far fewer people will bother to learn foreign languages, relying instead on AI.
But the upside is that language barriers between people on our planet will be essentially erased, and people will more easily understand one another. That’s got to be a good thing.
In the meantime, live translation tech has been a radical and welcome game-changer for me as I travel the world as a digital nomad. Partnering with AI, I can speak foreign languages I never learned.
Nvidia, xAI and two energy giants join genAI infrastructure initiative
An industry generative artificial intelligence (genAI) alliance, the AI Infrastructure Partnership (AIP), on Wednesday announced that xAI, Nvidia, GE Vernova, and NextEra Energy were joining BlackRock, Microsoft, and Global Infrastructure Partners as members. But given that the announcement specified no financial commitments or any other details, analysts doubted it would make much of a difference.
Still, even though the massive global momentum behind genAI is unlikely to be changed by the announcement, the addition of the two energy companies to the group was an implicit acknowledgement that the ever-increasing power requirements of genAI data centers need serious attention.
Scott Bickley, advisory fellow at Info-Tech Research Group, said the massive resources behind this initiative could make the difference; BlackRock alone reported in January that it held $11.6 trillion in assets, making it the world’s largest money manager.
Demand for VR headsets remains low
Sales of virtual reality (VR) headsets fell by 12% in 2024 compared to the previous year, according to a new report from analyst firm Counterpoint. The decline marks the third year in a row that sales have fallen — and it is mainly on the consumer side that demand is low.
The best performer is Meta, which has a 77% market share, followed by Sony, Pico, DPVR, and Apple.
When it comes to Apple, interest in the pricey Vision Pro has increased among business users. But the headset, which went on sale just over a year ago, is still only available in 13 countries and territories.
Counterpoint expects demand for VR headsets to remain low this year, though interest in smart glasses with augmented reality (AR) capabilities is expected to increase significantly.