I just finished all three seasons of The Boys and I gotta say, I enjoyed it tremendously. Well, except for that mid third season.

Wtf was that, actually? I should have seen it coming; it's f-ing Amazon. But what the hell, man? Can we ever have something devoid of woke politics in it? Like, ever?

Can't things just be, you know, entertainment like they're actually supposed to be? This shit is aggravating.
Soldier Boy was the best part of Season 3. But yeah, Season 3 is inferior to the first two seasons overall.
 