Apple's iPhone voice-to-text feature is drawing attention for a technical error that occasionally displays "Trump" when users speak the word "racist," Fox News Digital reported Tuesday. The glitch, which first gained traction through a viral TikTok video, has prompted Apple to acknowledge the issue and announce that a fix is being rolled out.
Caption from the viral TikTok post: "My dad sent me this video this morning. He told me his friend noticed that when he used speech to text and said 'racist,' it briefly changed to 'Trump' before changing back. Seems like subliminal messaging to me. I don't have an iPhone and my phone doesn't do it."
According to Fox News Digital, the outlet was able to replicate the problem multiple times, confirming that the dictation feature briefly shows "Trump" before correcting to "racist," matching the behavior demonstrated in the widely shared social media post. However, Fox News Digital noted that the error doesn't occur consistently, with the system accurately transcribing "racist" in most instances.
An Apple spokesperson addressed the controversy in a statement to Fox News Digital, saying: "We are aware of an issue with the speech recognition model that powers Dictation, and we are rolling out a fix as soon as possible." The company explained that its speech recognition technology sometimes temporarily displays words with phonetic similarities before settling on the correct term, Fox News Digital reported.

The company further clarified that the bug affects various words containing the "r" consonant during dictation, not just this specific case. Apple indicated that these momentary misinterpretations are part of how the system processes speech before determining the intended word.
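The behavior Apple describes is consistent with how streaming speech recognizers generally work: interim hypotheses are displayed as audio arrives and are revised once more context is available. A minimal sketch of that idea, using invented candidate words and scores (nothing here reflects Apple's actual model or code):

```python
# Hypothetical illustration of incremental dictation: a streaming
# recognizer shows its current best guess after each audio chunk,
# then replaces it once later audio disambiguates the word.
# Candidates and scores below are invented for this sketch.

def interim_hypotheses(partial_scores):
    """Yield the best-scoring candidate after each simulated audio chunk."""
    for chunk in partial_scores:
        # Display whichever candidate currently scores highest.
        yield max(chunk, key=chunk.get)

# Early audio is ambiguous between phonetically similar words;
# the second chunk carries enough context to settle on the right one.
chunks = [
    {"racist": 0.48, "races": 0.52},   # interim: similar-sounding word shown briefly
    {"racist": 0.91, "races": 0.09},   # final: corrected with more audio
]
print(list(interim_hypotheses(chunks)))  # → ['races', 'racist']
```

In this toy model, the briefly displayed wrong word is simply the momentary front-runner among phonetically similar candidates, which matches Apple's description of the bug as a transient misinterpretation rather than intentional substitution.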
The viral TikTok video that exposed the issue shows a user demonstrating how speaking the word "racist" initially generates "Trump" on screen before switching to the correct word. The video's caption suggests this could be "subliminal messaging," though Apple attributes it to a technical error rather than intentional programming.
This incident marks the second recent controversy involving technology platforms and the portrayal of President Donald Trump. Last September, Amazon's Alexa virtual assistant faced scrutiny after users discovered it would provide reasons to vote for then-Vice President Kamala Harris while declining to offer similar information about Trump.
In that case, Fox News Digital reported that Amazon representatives briefed the House Judiciary Committee about the discrepancy, explaining that Alexa uses pre-programmed manual overrides for certain user prompts. The company had initially only created these overrides for then-President Joe Biden and Trump, neglecting to include Harris because few users had requested information about her candidacy.
Amazon reportedly identified and addressed the Alexa issue within hours of the video going viral on social media. According to sources familiar with the matter, the company apologized during the congressional briefing, acknowledging that while Amazon has a policy preventing Alexa from expressing political opinions or bias, "obviously we are here today because we did not meet that bar in this incident."
These sources indicated that Amazon has since conducted a comprehensive audit of its system and implemented manual overrides for all candidates and numerous election-related queries, expanding beyond its previous focus on only presidential candidates.
Apple has not specified exactly when the fix for the dictation issue will be deployed to affected devices, but indicated it would be "as soon as possible," according to the company's statement to Fox News Digital.