Johnny Depp Announces Departure from Hollywood: Criticizes ‘Woke’ Agenda and Cancel Culture

Johnny Depp, renowned actor and Hollywood icon, has recently made headlines by announcing his departure from the film industry. In a candid interview at the Cannes Film Festival, Depp shared his reasons for turning his back on Hollywood’s ‘woke’ agenda and the toxic culture of canceling anyone who dares to challenge the mainstream narrative. This…