

“What are the lowest risk use cases to apply AI speech translation at scale in a healthcare setting?” This question kicked off the Speech Translation panel at SlatorCon Silicon Valley 2024, which featured Oddmund Braaten, CEO of Interprefy, Fardad Zabetian, CEO of KUDO, and Jeremy Woan, Chairman and CEO of CyraCom International.

“That’s a challenging question because I do have a different perspective,” said CyraCom’s Woan. “If a healthcare provider decides to be non-compliant and use machine interpretation, that’s their decision and their risk. For me, it would be troubling for an LSP to give that recommendation. What might, at first, seem a low-risk situation could easily become higher risk very quickly.”

For KUDO’s Zabetian, “our bread and butter is learning and development. Then, of course, it’s language accessibility.” The company continues to assess the fast-growing number of new use cases that come with speech-to-speech AI. KUDO reported that while 25% of meetings used AI in this past quarter, human interpretation bookings also increased by up to 14%, showing that both modes can co-exist and create new use cases.

“If a healthcare provider decides to be non-compliant and use machine interpretation, that is their decision and their risk.” — Jeremy Woan, Chairman and CEO, CyraCom International

Slator’s Managing Director, Florian Faes, followed up with a question to the panelists on how interpreting demand has changed in the past year.

SCSV24 - AI Speech Translation Panel

Interprefy’s Braaten said, “half of it is online and half of it is hybrid and on-site. However, […] what surprised me [when] we launched AI speech [is that] on-site meetings [use] proportionally more AI interpreting than online [meetings].”

Braaten described how a non-governmental organization recently decided to leverage the company’s AI interpreting to maximize accessibility while containing costs. “They were very happy. […] [Without AI,] they would have just gone for Spanish and not the other languages. That’s what we’re also seeing. It expands the number of languages. […] In the end, they get, let’s say, more for their money,” Braaten added.

“On-site meetings use proportionally more AI interpreting than online meetings.” — Oddmund Braaten, CEO, Interprefy

Demand set to keep growing

When asked for five-year predictions for speech interpreting, the three CEOs were optimistic.

While CyraCom’s Woan believes that regulations “won’t change quickly enough to allow large-scale adoption of machine interpretation in regulated industries,” he believes that the availability and affordability of remote interpreting will drive up demand.

Referencing a future “correction layer as good as a great interpreter”, Interprefy’s Braaten predicts that quality improvements in the technology will unlock more demand. “There are huge opportunities to grow the market without this handbrake [on quality],” he added.

“We’re going to see voice cloning and lip-syncing, all of that good stuff within the experience of live sessions and live meetings.” — Fardad Zabetian, CEO, KUDO

As for KUDO’s Zabetian, AI speech translation “is going to really become a lot more personalized. […] We’re going to see voice cloning and lip-syncing, all of that good stuff within the experience of live sessions and live meetings.” 

Zabetian also believes it will increase language accessibility. “It’s just a very exciting moment. We can predict that [AI speech translation is] going to give us a much better personal and professional experience,” he concluded.


