Disclosure

This article may contain affiliate links from one or more partners. Learn how we make money to continue our financial wellness mission.

Disclosures are required by federal and state laws that obligate companies to give consumers information about their products and services or about the company's financial condition.

For investors, disclosures help them make informed decisions about a company's securities.

For credit applicants, lenders are required to disclose the terms of the credit being extended.