
Auto Body Warranty on Repairs

Posted: February 1st, 2017 By Louie Sharp

Last week I talked about insurance steering. As a reminder, it is a Federal Offense for an insurance company to tell you where to take your car to get it fixed after an accident.

One of the things insurance companies will tell you is that if you don't take your car to the shop they recommend, you won't get a warranty on the repairs. This is NOT true. The insurance company NEVER warranties the repairs. The warranty is always provided by the shop that does the repairs. Ask the shop what type of warranty it provides and how long the warranty is good for. If you have issues after the repair and you call the insurance company, they will tell you to call the shop that fixed the vehicle. It happens every time.
