• tchalla an hour ago

    > Notably, this is the second time CEO Aviad Maizels has sold a company to Apple. In 2013, he sold PrimeSense, a 3D-sensing company that played a key role in Apple’s transition from fingerprint sensors to facial recognition on iPhones. Q.ai launched in 2022 and is backed by Kleiner Perkins, Gradient Ventures, and others. Its founding team, including Maizels and co-founders Yonatan Wexler and Avi Barliya, will join Apple as part of the acquisition.

    Twice, well done!

    • tartoran an hour ago

      What kind of tech does Q.ai bring to the table?

      • causalmodels 38 minutes ago

        " As first reported by Reuters, Apple has acquired Q.ai, an Israeli startup specializing in imaging and machine learning, particularly technologies that enable devices to interpret whispered speech and enhance audio in noisy environments."

        • cyrusradfar 27 minutes ago

          [puts on tin foil]

          you mean something that improves the detection and transcription of voices when the person doesn't realize the mic is on, like when it's in our pocket?

          • Noaidi 25 minutes ago

            Yeah, so, I am never turning on Apple Intelligence...

      • yomansat 17 minutes ago

        It still surprises me how everyone closed their Russia-based stores when Russia invaded Ukraine, but here's a much worse situation and it's business as usual...

        • myth_drannon 3 minutes ago

          Maybe because no one has Palestinian-based stores?

        • Sir_Twist an hour ago

          “Q.ai is a startup developing a technology to analyze facial expressions and other ways for communication.”

          This is an interesting acquisition given their rumored Echo Show / Nest Hub competitor (1). Maybe this is part of their (albeit flawed and delayed) attempt to revitalize the Siri branding under the Apple Intelligence marketing. When you have to say the exact right words to Siri, or else she will add “Meeting at 10” as an all-day calendar event, people get frustrated, and that non-technical illusion of the “digital assistant” is lost.

          If this is Apple’s model of how customers perceive Siri, then maybe their thinking is that giving Siri more non-verbal, personable capability could be a differentiating factor in the smart hub market, along with the LLM rebuild. I could also see this tying into some sort of strategy for the Vision Pro.

          Now, whether this hypothetical differentiating factor is worth $2 billion, I’m not so sure, but I guess time will tell.

          https://www.macrumors.com/2025/11/05/apple-smart-home-hub-20...

          • deepfriedchokes 2 hours ago

            Sounds pretty invasive for privacy, if this was ever paired with smart glasses in public.

            • Lammy an hour ago

              Hence the name, I assume.

              • cyrusradfar 26 minutes ago

                and a very expensive domain.

            • clueless 2 hours ago

              Could Q.ai be commercializing the AlterEgo tech coming out of MIT Lab? i.e. "detects faint neuromuscular signals in the face and throat when a person internally verbalizes words"

              Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...

              • mikestorrent an hour ago

                Yeah...

                Pardon the AI crap, but:

                > ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound). Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks

                So, how long until my Airpods can read my mind?
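
                As a toy illustration (not Q.ai's actual method, and the window size and threshold are made up), the EMG idea above boils down to comparing the signal's short-window RMS energy against a resting baseline:

```python
import math

def rms(window):
    # Root-mean-square amplitude of one window of EMG samples
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_subvocal(samples, window_size=50, threshold=0.2):
    # Slide a fixed window over the signal and flag windows whose
    # RMS exceeds the (hypothetical) resting-baseline threshold.
    flags = []
    for i in range(0, len(samples) - window_size + 1, window_size):
        flags.append(rms(samples[i:i + window_size]) > threshold)
    return flags

# Synthetic trace: quiet baseline, then a burst of muscle activity
quiet = [0.05 * math.sin(0.3 * i) for i in range(100)]
burst = [0.5 * math.sin(0.3 * i) for i in range(100)]
print(detect_subvocal(quiet + burst))  # → [False, False, True, True]
```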

              • concavebinator 42 minutes ago

                In case there are any Ender's Game fans here, the capability to understand micro-expressions reminds me of how Ender subvocalizes to Jane. Orson Scott Card predicted yet another technological norm.

                • danhite 3 minutes ago

                  [delayed]

                • stefanos82 an hour ago

                  Why do I have a feeling that one of their reasons was so they could trademark "iQ", to match the iSomething "franchise", so to speak?

                  • gralab an hour ago

                    Apple dropped the "i" naming scheme many years ago.

                    • sgjohnson 39 minutes ago

                      iCloud, iPad, iPhone, iMac, iMessage, iOS/iPadOS, iMovie?

                      Granted, they are slowly but surely killing it, but it’s still going quite strong.

                  • undefined 3 hours ago
                    [deleted]
                    • alecco 42 minutes ago

                      It's kind of sad watching Apple drift into irrelevance. I know I'm not going to buy more products from them because nothing they have is worth the premium price.

                      • assaddayinh 2 hours ago

                        The ability to impress CEOs and signal hotness to investors may not correlate at all with the ability to produce breakthrough technology. Thus companies like Google grow up unbought to then become ..

                        • bnchrch 2 hours ago

                          Wake me up when they let one of these acqui-hires update Siri to be on par with a voice assistant I could make in an afternoon with off-the-shelf tools.

                          • alighter 2 hours ago

                            This. And next word prediction / autocorrect that doesn’t look like it’s from the previous century.

                            • tobmlt 2 hours ago

                              On both my Nokia and my BlackBerry it was far, far better than on my iPhone. That wasn't quite 199X but pretty close.

                              I wish the iPhone had word prediction and autocorrect that was from the previous century.

                              • thewebguyd an hour ago

                                BlackBerry's keyboards & autocorrect were top-notch. Nothing has matched them yet on a pure virtual touch-screen keyboard.

                                Crazy that they had pretty much perfected the tech of typing out text on a smartphone and then decided to throw it all away by moving to all-screen devices instead. A virtual keyboard with no tactile feel will never compare until we have screens that can recreate the tactile bumps of a physical keyboard.

                              • darth_avocado 2 hours ago

                                Apple autocorrect has actually gotten worse over the last decade. Before, it would be "duck" instead of a similar-sounding word, and it took one action to correct it. Now it's just "fuschia" and it takes 5 minutes to correct the correction to the autocorrect.

                                • tartoran an hour ago

                                  I agree with this sentiment. It was so annoying that I turned autocorrect off. I found that writing on the iPhone has gotten worse as well, or at least that's my own observation. On the other hand, voice dictation has improved enough that I can just dictate into my phone when needed. For more serious work I use a work device, not a consumption one.

                              • wahnfrieden 2 hours ago

                                that already made the news. it will be powered by Gemini and may launch before next WWDC.