Do you guys have mandatory first aid courses in America?
I believe most schools have a mandatory health course that teaches basic CPR. I know I took mine like 5 years ago.
I haven't taken my CPR class yet smh.
That's because you live in the South, and the Southern educational system is arguably worse than the Northern states' lmao.
Edit: Oh shit boys I finally got Colonel!!!
Wow, that is rude & not really true. Atlanta has a great education system, boys. However, I live in the woods, so you are somewhat right. xD
That's why I said -arguably- xD. I'm just going off what I've heard from other people. Like, West Virginia isn't THATTTT far south, but it's still a bit south. A friend of mine lived there, and he would frequently ask me for help on his papers. He once showed me an essay he had to write for his SENIOR year of high school, and that thing was barebones: like five sentences, three paragraphs, and he said that would get him an A. If I turned that in under my system, I would get an F lmao.