
Should Sex Education Be Taught in Schools?

May 22, 2019
By Anonymous

Sex education is important, and this has been shown time and time again. Students who receive formal sex education in schools tend to have sexual intercourse for the first time later than students who have not had sex education. Sex education does not encourage teenagers to have sex; it does quite the opposite. Many American teenagers are sexually active and think nothing could ever happen to them, yet many of them are misinformed about the risks involved in sex. Teens also don't always know the best ways to protect themselves and their partners from pregnancy or STDs. The more educated someone is, the more likely they are to make responsible, informed choices about their behavior. Sex education given by teachers at school is the most reliable way to give kids the right information about sex. In schools, this information is delivered by professionals and is supported by reports from across the country and around the world. Even parents agree, as shown in reports such as "School-based Programs to Reduce Sexual Risk Behavior." Should sex education be taught in schools? I think schools should teach young teens about sex so they can make the right choices.


