What you need to know about Dental Insurance!
Dental insurance is a contract between you (or your employer) and a dental insurance company. The benefits you receive are based on the terms negotiated between you or your employer and the insurance company, not on anything determined by your dental office. Most dental insurance policies are designed to cover only basic care for specific dental services. Which services are covered depends on the cost of the policy to you or your employer and the arrangements negotiated with the dental insurance company.