Dentistry in the US is the worst industry I've ever seen... they don't act like doctors or physicians treating you as a patient, but more like a car dealer or an auto repair shop.
Check your insurance or your 'state ADA commissioner'; each state has regulations covering dental bills like this. White people are well aware of it.