It seems like no matter where I go, I can’t escape this huge cloud of Hollywood bullshit that is chasing us all into oblivion. You know what I’m talking about.
That fake, shallow California image that industry execs and hipster schmucks like to sell us. The coolest place on earth. The wannabe moral compass of America that is ironically the most decadent place since Sodom and Gomorrah. You see it on TV as a kid, you hear about it in your music, and travel aficionados just won’t shut the fuck up about how great it is.
I know they have legal weed, amazing sunsets, sexually loose women, and fancy, expensive gyms... but is the City of Angels really worth all the hype?
Seems like anyone who lives there for a decent chunk of time eventually grows disillusioned with the place. They get tired of the political chaos, the crime becomes too much to handle, and the broads with fake tits get old and too toxic for anyone looking to settle down. Even Venice Beach is no longer any fun; it’s just another industry meme.
Yet even after understanding all that, I just can’t help but feel like I’m missing out on this weird little planet known as California.
I want to experience the hypocrisy, the awesome cinematic views, the horrible heat, the nightlife, the blowjobs in the In-N-Out parking lot, the homeless people, and even the gangbangers!
L.A. is my love and my hate. I lust for it, yet I feel nauseated when I hear it calling out to me. It represents everything I yearn for yet utterly despise.
AND I HAVEN’T EVEN BEEN THERE YET!
West Hollywood brunettes, keep on waiting for me.