
Community Guidelines


At VideoMatch, we are on a mission to inspire millions of users across the globe to foster meaningful connections without fear of rejection or judgment, and without compromising their safety.

We are committed to giving users a safe and secure environment to connect with others and express themselves by building and protecting a one-to-one space that respects your privacy and safety. That said, we are aware the online world can also bring instances of misuse and abusive behaviour, and as such, we will take action based on our community guidelines and company values.

Our community guidelines define what is and is not allowed on the platform. They are set to encourage open, authentic, and healthy discussion, which contributes to building a safe environment where you can be yourself. Safety and trust are our number one priority, and we work hard every day to improve our technology and enforcement efforts to prevent and/or remove harmful content and behaviour whenever we become aware of it. In addition, we encourage you to report inappropriate behaviour or content you may come across so we can review it against our guidelines.

To report content you think may violate these guidelines, please follow the “How to Report” link in our Safety Centre. Every time you flag something to us, you help us make our community safer. Thank you!


Minors are not allowed on our platforms. We leverage state-of-the-art detection technology and human moderation to keep underage users off our platform. Our age verification process combines artificial intelligence with a KYC review to provide a complete and holistic analysis of user profiles. Human moderators review accounts flagged by our automated systems or by user reports and take action accordingly.


At VideoMatch, we believe people need to feel safe to build meaningful connections, and we work hard to protect our users, especially minors. We are committed to making our community a safe place where content that promotes harmful or dangerous behaviour isn’t allowed. Our rules are designed to mitigate potential online risks, focusing on child safety, sensitive content, and criminal behaviour.

Violent and Hateful Behaviour:

Adult-themed Content:

Bullying & Harassment:

Child Safety:

Suicide & Self-Harm:

Illegal or Criminal Activities:

We reserve the right to report any of these activities to the relevant authorities.

Private Information:


Our global online community is built on trust, and therefore inappropriate content and users who intend to scam, mislead, or deceive others are not permitted on our platforms. That’s why we have put in place rules around authenticity, fraudulent activities, and deceptive behaviour.

Account Authenticity:

Name Policy:

We encourage our community to use the name they go by in their daily lives, the name they identify with, or one that is otherwise meaningful to them. A name helps others know who they are connecting with while maintaining an authentic and respectful environment.

Deceptive Behaviour and Spam:

Photos, Shares, and Video Streaming Guidelines