I just finished all three seasons of The Boys and I gotta say, I enjoyed it tremendously. Well, except for that mid third season.
Wtf was that, actually? I should have seen it coming, it's f-ing Amazon. But what the hell, man? Can we ever have something devoid of woke politics in it? Like, ever?
Can't things just be, you know, entertainment, like they're actually supposed to be? This shit is aggravating.