Should I keep going with my faith?
I was born into a religious family, schooled in private religious schools since I was 6 years old, and I'm now the secretary of a religious organization at my university.
The problem is that I've seen how constraining my family was, and I realized how heavily indoctrinated I had become during high school after a certain politically driven religious rally in my country, to the point that I asked myself, "Is religion really relevant?"
I tried asking this to everyone I can trust in real life, but they all just say "shut up and believe," which contradicts my logic and common sense.