Dental insurance is a contract with an insurance provider that helps cover a portion of the cost of dental care for individuals and families. It is usually not part of regular health insurance, although it may be available as an optional add-on.
For more information, check out "What is Dental Insurance? A Simple Explanation for Kids, Teens & Beginners."