Confronting the CEO of the AI company that impersonated me
Summary
The host, Nilay Patel, interviews Shishir Mehrotra, CEO of Superhuman (formerly Grammarly), focusing heavily on the recent controversy surrounding Grammarly's "Expert Review" feature. The feature synthesized writing suggestions from AI-cloned "experts," including the host and other journalists, without obtaining permission to use their names. Mehrotra apologizes, calling the feature poorly executed and misaligned with the company's strategy, and says it was killed shortly after the first complaints, well before the resulting class-action lawsuit filed by Julia Angwin.

The discussion delves into the decision-making process behind the feature. Mehrotra explains that the intent was to satisfy users' desire for expert feedback, but acknowledges the execution failed both users and the experts it invoked. The two debate the ethics of using names and likenesses for commercial purposes versus simple attribution, with Mehrotra arguing the feature was attribution rather than impersonation, while conceding that the legal standard is not the bar the company aims for.

The conversation then broadens to the extractive nature of AI, polling that places public perception of AI companies below even ICE, and the future of the creator economy. There, Mehrotra advocates for creators building direct, paid connections with audiences (such as subscriptions) via platforms like Superhuman Go, rather than relying on fragmented ad revenue.
(Source: The Verge)