I’m aware that, at the moment, Florida is deemed unsafe for people to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
The news about Florida exaggerates life here. It’s not at all like what you see in the headlines or in the “Florida Man” posts that always make the rounds. Life is fine. There’s good and bad. Painting it as some “DO NOT ENTER - UNSAFE TERRORIST ZONE” is utter bullshit.
That being said, it would be great if New Yorkers and New Jerseyites fucked off back to their states and stopped buying all the properties here.
Are you LGBT?
Tell that to anyone in the LGBTQ+ community. Or anyone seeking an abortion. Or any child wanting to learn about systemic racism. Or anyone openly admitting to being atheist in school. Or anyone refusing to recite the Pledge of Allegiance.
The news about Florida is SPOT-FUCKING-ON, and is why we moved away from there six years ago to find a better life.
Yeah, but when being trans may mean your kids will get taken from you, it’s a do-not-enter state for you.
Pretty privileged thing to say… Maybe when you and I are the ones they start writing laws against just for existing, then it’ll actually be bad?